BenQ’s XL2730Z ‘FreeSync’ monitor reviewed

I get a little bit too excited about monitors these days, to be honest. The market for desktop PC displays has traditionally been sleepy and slow-moving, but things have changed in the past year or so. Pixel densities and display quality are up generally, and the prices for big LCD panels have been dropping at the same time. Yet in our latest TR Hardware Survey, over two thirds of our readers are rocking a main desktop monitor resolution of 1920×1200 or lower. The time has come for an awful lot of folks to consider an upgrade.

That said, perhaps the single biggest reason for PC gamers to consider upgrading their displays isn’t size, pixel count, or contrast ratio. Nope, it’s variable refresh technology. You may have already heard us wax poetic about the smooth animation made possible by Nvidia’s G-Sync. AMD has been promising to release a competing standard under the clever name FreeSync, with the “free” implying an open standard and lower costs than Nvidia’s proprietary tech. One of the first FreeSync monitors, the BenQ XL2730Z, arrived in Damage Labs not long ago, and we’ve been spending some quality time with it since to see how it handles.

The short answer: it’s buttery smooth, just like you’d hope. Read on for our take on the current state of FreeSync, how it compares to G-Sync, and the particulars of this BenQ monitor.

Variable display refresh: the story so far

The need for variable-refresh displays stems from the fact that today’s LCD monitors generally operate on principles established back in the CRT era, most notably the notion of a fixed update rate. The vast majority of electronic displays update themselves on a fixed cadence, usually 60 times per second (or 60Hz). Gaming-oriented displays are sometimes faster, but any display with a fixed refresh rate introduces a problem for gaming animation known as quantization.

Put simply, quantization introduces fixed steps into a more variable pattern of information. You’ve heard the effects of quantization in the autotune algorithms now used by apparently every pop singer. This same “roughness” applies visually when games produce frames at irregular intervals and display quantization maps them to fixed intervals. Here’s an example from AMD illustrating what happens when a frame isn’t ready for display at the end of a single refresh cycle.

Display quantization illustrated. Source: AMD.

With a fixed refresh rate of 60Hz, the frame-to-frame interval would be 16.7 milliseconds. If a new frame isn’t ready after one of those intervals—even if it’s ready in 16.8 ms—the display will show the prior frame again, and the user will have to wait until another whole interval has passed before seeing new information. Total wait time: 33.3 ms, twice what you’d usually expect—and the equivalent of 30Hz.

Rendering times vary from frame to frame even on the fastest graphics cards.

Now, say that GPU got really bogged down for some reason and a new frame wasn’t ready for 33.5 ms. You’d have to wait three intervals, or a total of 50 ms, for the next frame to be displayed. That’s a fairly punishing wait, one that would likely interrupt the illusion of animated motion in the game. Not only does it take time for the updated frame to reach the screen, but the frame that eventually gets displayed is essentially out of date by the time it reaches the user’s eyes.

These 16.7-ms steps are the basis of quantization on a 60Hz display, and they unavoidably drain some of the fluidity out of real-time graphics. Quantization can make a fast GPU seem slower than it really is by exaggerating the impact of small delays in frame production. By increasing the time it takes for new frames of animation to reach the display, quantization also increases input lag, the total time between user input and a visual response.
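
To make the arithmetic concrete, here is a minimal Python sketch of fixed-refresh quantization. It is purely illustrative (nothing here comes from AMD or the monitor), but it reproduces the waits described above for a 60Hz panel.

```python
# Fixed-refresh quantization, illustrated: a 60Hz display only updates on
# ~16.7 ms boundaries, so a frame that misses one refresh waits for the next.
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms between refreshes

def display_time_fixed(render_ms: float) -> float:
    """Time until a frame actually appears on a fixed-refresh display."""
    refreshes_needed = math.ceil(render_ms / INTERVAL_MS)
    return refreshes_needed * INTERVAL_MS

for render in (16.0, 16.8, 33.5):
    shown = display_time_fixed(render)
    print(f"rendered in {render:4.1f} ms -> displayed after {shown:4.1f} ms")
# rendered in 16.0 ms -> displayed after 16.7 ms
# rendered in 16.8 ms -> displayed after 33.3 ms
# rendered in 33.5 ms -> displayed after 50.0 ms
```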

PC gamers have often attempted to avoid the worst effects of display quantization by disabling the video card’s synchronization with the display. With vertical refresh synchronization (vsync) disabled, the video card will shift to a new frame even while the display is being updated. Allowing these mid-refresh updates to happen can bring fresh information to the user’s eyes sooner, but it does so at the cost of image integrity. The seams between one frame and the next can be seen onscreen, an artifact known as tearing.

An example of the tearing that happens without vertical refresh synchronization.

Yeah, it’s kind of ugly, and tearing can be downright annoying at times. Beyond that, going without vsync doesn’t change the fact that the display only updates every so often.

Fortunately, today’s LCD monitors don’t need to follow a fixed refresh interval. Within certain limits, at least, LCDs can wait for the GPU to produce a completed new frame before updating the screen. Here’s how AMD’s FreeSync presentation illustrates variable refresh at work.

Variable refresh in action. Source: AMD.

The display is updated right when a new frame is ready, so in the case of our example above, there’s no need to wait 33.3 ms for a frame that takes 16.8 ms to render. You only wait the extra tenth of a millisecond and then paint the screen. Quantization is replaced by much smoother animation.
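
Here is the variable-refresh counterpart to the earlier sketch, again just an illustration of the idea: within the panel’s supported range, a frame is painted as soon as it is finished.

```python
# Variable refresh, simplified: the display paints the frame as soon as it is
# done, limited only by the panel's fastest supported interval. The slow-frame
# limit at the other end of the range is covered later in the article.
def display_time_variable(render_ms: float, min_interval_ms: float = 1000 / 144) -> float:
    return max(render_ms, min_interval_ms)

print(display_time_variable(16.8))  # 16.8: no 33.3 ms quantization penalty
```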

Obviously, variable refresh rates aren’t a fix for everything. A slow GPU or CPU can still introduce frame production hiccups the user will feel. But eliminating the quantization effect does have a very nice, easily appreciable impact on a 60Hz display. I’m pleased to see this technology coming to PC gaming. It’s yet another example of innovation happening in this space that will likely trickle into other markets later.

Sorting out the names and brands

Right now, unfortunately, we’re faced with competing standards for variable refresh displays. You can’t just buy a monitor with that feature, connect it to any graphics card, and turn on the eye candy spigot.

Nvidia was first to market with G-Sync technology, which is based on the firm’s own home-brewed display logic module. Monitor makers must buy Nvidia’s module in order to build a G-Sync display. Then, displays with G-Sync technology can only provide variable refresh rates when used in concert with a newer GeForce card—basically a GeForce GTX 600-series model or newer. Currently, there’s a handful of decent G-Sync displays available for purchase, but they generally come with a price premium attached.

Meanwhile, AMD has taken a rather different tack with its FreeSync effort by attempting to create an industry-wide standard for variable refresh displays. The firm persuaded the VESA standards board to approve an extension to the DisplayPort spec known as Adaptive-Sync. This spec is open for the entire industry to adopt at no extra cost and is meant to ensure compatibility between GPUs and monitors capable of variable refresh rates.

Next, AMD worked with some of the biggest players in the display logic business, helping them to implement Adaptive-Sync capabilities. Three firms, Realtek, MStar, and NovaTek, have built Adaptive-Sync support into their monitor control chips—and they’ve apparently done so without incorporating a big chunk of DRAM like Nvidia built into its G-Sync module.

While those efforts were underway, the folks at AMD also encouraged a host of display manufacturers to build monitors with Adaptive-Sync support. Our subject today, the BenQ XL2730Z, is one of those products. As part of its FreeSync initiative, AMD has offered to certify monitors for proper variable-refresh operation at no cost to the display makers. The firm will then lend its FreeSync brand to those monitors that work properly—although FreeSync branding is by no means necessary for a display to be Adaptive-Sync compliant.

Got all that?

Thanks to its open, collaborative approach and the use of display logic chips from incumbent providers, AMD expects Adaptive-Sync support to add very little to the cost of building a display.

Of course, in order to make it work, you’ll need a Radeon graphics card to pair with the monitor. Right now, only certain Radeon GPUs have the necessary hardware to support variable refresh, including Hawaii, Tonga, and Bonaire. Radeon R7 260/X, R9 285, and R9 290/X cards should be good to go, but the R9 270/X and 280/X aren’t. One would hope for broader support among current cards, but AMD continues to sell some, uh, well-worn graphics chips aboard its current products.

One other caveat: multi-GPU configs aren’t yet supported. AMD has pledged to release drivers that enable CrossFire multi-GPU with FreeSync some time this month.

 

BenQ’s XL2730Z: among the first

The BenQ XL2730Z is right in the sweet spot of what I’d want out of a PC gaming display. It combines a speedy 144Hz peak refresh rate with a 2560×1440 panel that measures 27″ from corner to corner. Those specs closely mirror the vitals of the ROG Swift PG278Q, one of my favorite gaming displays to date. In fact, I wouldn’t be surprised if these two monitors were based on the same brand and model of LCD panel.

Panel size: 27″ diagonal
Native resolution: 2560×1440
Aspect ratio: 16:9
Panel type/backlight: TN/LED
Refresh rate: 40-144Hz; variable via Adaptive-Sync
Display colors: 16.7 million
Max brightness: 350 cd/m²
Peak contrast ratio: 1000:1
Optimal viewing angles: 170° horizontal, 160° vertical
Response time (gray to gray): 1 ms
Display surface: Matte anti-glare
HDCP support: Yes
Inputs: 1 x DisplayPort 1.2, 1 x DVI-DL, 1 x HDMI 2.0, 1 x HDMI 1.4, 1 x D-sub, 1 x USB 3.0, 1 x headphone, 1 x mic
Outputs: 2 x USB 3.0, 1 x headphone, 1 x mic
Peak power draw: 65W
Wall mount support: VESA 100 x 100 mm
Weight: 16.5 lbs (7.5 kg)

Aside from its support for a different variable-refresh standard, which happens exclusively through the DisplayPort connection, the XL2730Z has a whole array of conventional port types, like HDMI and DVI. Most G-Sync monitors rely solely on DisplayPort, I believe due to the limitations of Nvidia’s module.

Thanks to AMD’s more collaborative approach, the XL2730Z is very much BenQ’s own creation, and it’s packed with features that should be familiar from the company’s other gaming-centric displays, things like Black eQualizer and multiple custom game profiles. As we’ll see, that’s kind of a mixed blessing.

Right now, the XL2730Z is selling for $629.99 at Newegg, which is indeed cheaper than the competition. The G-Sync-based Asus ROG Swift PG278Q is going for $779.99. Thing is, neither one is exactly cheap. You can pick up a 27″ monitor with the same resolution based on IPS technology with a 60Hz refresh rate for under $400. If you’re not paying extra for FreeSync hardware, you’re still paying extra for the cachet—and for silky-smooth 144Hz refresh rates.

With that lengthy introduction out of the way, let’s jump right into a look at how this FreeSync display performs in games.

The FreeSync experience

Setting up FreeSync is dead simple. You just tick a checkbox in the Catalyst Control Center enabling FreeSync, and off you go. On the XL2730Z, the display’s refresh rate varies from a minimum of 40Hz to a peak of 144Hz. In other words, the intervals between frames range from 25 ms to 6.94 ms dynamically on a per-frame basis.

Once it’s enabled and you’re running a game, one thing is clear: AMD and BenQ have succeeded in delivering the same sort of creamy smooth animation that we know from G-Sync displays. The fluidity is easy to discern. Gaming with variable refresh is a gratifying, visceral thing—a heightened version of the core experience that makes action-oriented games so addictive. You really do have to see a fast variable-refresh display in person in order to fully appreciate it.

We can’t replicate the experience in a video shown on a conventional display, but we can slow things down in order to illustrate the difference in animation smoothness. I’ve created a series of comparison videos, shot at 240 frames per second on an iPhone 6 and played back in slow-motion, that shows the XL2730Z in its 144Hz-peak variable refresh mode compared to other options. The first one pits this mode against a 60Hz refresh rate with vsync enabled. You may want to play the video full-screen in order to get a good look.

The difference in fluidity is dramatic, especially at the edges of the screen, where objects are moving the fastest in our example scene. There are still some occasional hitches where animation isn’t perfect on the 144Hz FreeSync config. Those are likely the result of slowdowns somewhere else in the system, perhaps caused by CPU or GPU performance limitations. Slowdowns do occasionally still happen with variable refresh, but they shouldn’t be the fault of the display.

Our second example, above, compares variable refresh on the BenQ XL2730Z to a 60Hz display mode with vsync disabled. Again, the 144Hz FreeSync setup looks stellar. I don’t think turning off vsync makes the 60Hz display mode look much smoother, and without vsync, you can sometimes see tearing, especially in the hillside and the trees.

If you already have a 144Hz gaming monitor, adding variable refresh to the mix isn’t as big a deal as it would be otherwise. The quantization effects of a seven-millisecond interval between frames aren’t as dramatic as with 16.7-ms steps. Both of these slow-mo videos look quite a bit nicer than our 60Hz examples. Still, I can detect some waggle (or unevenness) in the movement of the hillside in the 144Hz vsync video that’s not present with FreeSync enabled. I think we’re running into some GPU limitations here, too. We’ve seen clearer examples of variable refresh’s superiority at 144Hz in Skyrim in the past. The effect is subtle but discernible.

 

Trouble brewing? What happens at the edges?

One intriguing question about FreeSync displays is what they do when they reach the edges of their refresh rate ranges. As I’ve noted, the XL2730Z can vary from 40Hz to 144Hz. To be a little more precise, it can tolerate frame-to-frame intervals between 25 and 6.94 milliseconds. What happens when the frames come in from the GPU at shorter or longer intervals?

AMD has built some flexibility into FreeSync’s operation: the user can choose whether to enable or disable vsync for frames that exceed the display’s timing tolerance. Consider what happens if frames are coming in from the GPU too quickly for the display to keep up. With vsync enabled, the display will wait a full 6.94 ms before updating the screen, possibly discarding excess frames. (G-Sync always behaves in this manner.) With vsync disabled, the display will go ahead and update the screen mid-refresh, getting the freshest information to the user’s eyes while potentially introducing a tearing artifact. Since variable refresh is active, the screen will only tear when the frame rate goes above or below the display’s refresh range.
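
That decision tree may be easier to follow as pseudocode. The sketch below is strictly our simplified model of the behavior AMD describes; the constants match the XL2730Z’s 40-144Hz window, and all of the names are invented for illustration.

```python
# A simplified model of how frames inside and outside the XL2730Z's
# 40-144Hz variable-refresh window are handled, per the description above.
MIN_INTERVAL_MS = 1000 / 144  # ~6.94 ms: the panel can't refresh faster than this
MAX_INTERVAL_MS = 1000 / 40   # 25 ms: the panel must refresh by this point

def handle_frame(frame_interval_ms: float, vsync: bool) -> str:
    if frame_interval_ms < MIN_INTERVAL_MS:
        # Frames arriving faster than 144Hz
        return ("hold the new frame until the 6.94 ms refresh completes" if vsync
                else "swap mid-refresh: fresher image, possible tearing")
    if frame_interval_ms > MAX_INTERVAL_MS:
        # Frames arriving slower than 40Hz: re-show the last frame to keep
        # the LCD charged, then accept the new frame when it arrives
        return "repeat the previous frame, then display the new one"
    # Inside the 40-144Hz window: refresh exactly when the frame is ready
    return "display immediately (variable refresh)"

for interval in (5.0, 10.0, 30.0):
    print(f"{interval:4.1f} ms frame -> {handle_frame(interval, vsync=True)}")
```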

Giving users the option of enabling vsync in this situation is a smart move, one that I fully expect Nvidia to copy in future versions of G-Sync.

The trickier issue is what happens when the GPU’s frame rate drops below the display’s minimum refresh rate. I’ve seen some confusion and incorrect information at other publications about exactly how FreeSync handles this situation, so I took some time to look into it.

As you may know, LCD panels must be refreshed every so often in order for the pixels to maintain their state. Wait too long, and the pixel will lose its charge and drift back to its original color—usually white or gray, I believe. Variable-refresh schemes must cope with this limitation; they can’t wait forever for the next frame from the GPU before painting the screen again.

Some reports have suggested that when the frame-to-frame interval on a FreeSync display grows too long, the display responds by “locking” into a 40Hz refresh rate, essentially quantizing updates at multiples of 25 ms. Doing so would be pretty poor behavior, because quantization at 25 ms steps would mean horribly janky animation. You’d be making the worst of an already bad situation where the attached PC was running up against its own performance limitations. However, such talk is kind of nonsense on the face of it, since we’re dealing with a variable-refresh display working in concert with a GPU that’s producing frames at an irregular rate. What happens in such cases differs between FreeSync and G-Sync, but neither solution’s behavior is terribly problematic.

Let’s start with how G-Sync handles it. I talked with Nvidia’s Tom Petersen about this question, since he’s made some public comments on this matter that I wanted to understand.



Petersen explained that sorting out the timing of a variable-refresh scheme can be daunting when the wait for a new frame from the graphics card exceeds the display’s maximum wait time. The obvious thing to do is to refresh the display again with a copy of the last frame. Trouble is, the very act of painting the screen takes some time, and it’s quite possible the GPU will have a new frame ready while the refresh is taking place. If that happens, you have a collision, with two frames contending for the same resource.

Nvidia has built some logic into its G-Sync control module that attempts to avoid such collisions. This logic uses a moving average of the past couple of GPU frame times in order to estimate what the current GPU frame-to-frame interval is likely to be. If the estimated interval is expected to exceed the display’s max refresh time, the G-Sync module will preemptively refresh the display part way through the wait, rather than letting the LCD reach the point where it must be refreshed immediately.

This preemptive refresh “recharges” the LCD panel and extends its ability to wait for the next GPU frame. If the next frame arrives in about the same time as the last one, then this “early” refresh should pay off by preventing a collision between a new frame and a gotta-have-it-now refresh.
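
Nvidia hasn’t published the actual algorithm, so the sketch below is just our reading of Petersen’s description. The two-frame averaging window, the panel deadline, and every name here are assumptions made for illustration.

```python
# A rough sketch of the collision-avoidance idea described above: estimate the
# next frame time from recent history, and if it looks like the panel's maximum
# wait will be exceeded, refresh early with a repeat of the last frame.
from collections import deque

PANEL_MAX_WAIT_MS = 33.3  # hypothetical longest time the panel can hold a frame

class PreemptiveRefresh:
    def __init__(self, window: int = 2):
        self.recent = deque(maxlen=window)  # last few GPU frame times, in ms

    def note_frame(self, frame_ms: float) -> None:
        self.recent.append(frame_ms)

    def should_refresh_early(self) -> bool:
        """True if the estimated next frame would outlast the panel's deadline."""
        if not self.recent:
            return False
        estimate = sum(self.recent) / len(self.recent)
        return estimate > PANEL_MAX_WAIT_MS

g = PreemptiveRefresh()
g.note_frame(38.0)
g.note_frame(40.0)
print(g.should_refresh_early())  # True: refresh now so the panel can keep waiting
```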

I asked AMD’s David Glen, one of the engineers behind FreeSync, about how AMD’s variable-refresh scheme handles this same sort of low-FPS scenario. The basic behavior is similar to G-Sync’s. If the wait for a new frame exceeds the display’s tolerance, Glen said, “we show the frame again, and show it at the max rate the monitor supports.” Once the screen has been painted, which presumably takes less than 6.94 ms on a 144Hz display, the monitor should be ready to accept a new frame at any time.

What FreeSync apparently lacks is G-Sync’s added timing logic to avoid collisions. However, FreeSync is capable of operating with vsync disabled outside of the display’s refresh range. In the event of a collision with a required refresh, Glen pointed out, FreeSync can optionally swap to a new frame in the middle of that refresh. So FreeSync is not without its own unique means of dealing with collisions. Then again, the penalty for a collision with vsync enabled should be pretty minor. (My sense is that FreeSync should just paint the screen again with the new frame as soon as the current refresh ends.)

Everything I’ve just explained may seem terribly complicated, but the bottom line is straightforward. FreeSync’s logic for handling low-FPS situations isn’t anywhere near as bad as some folks have suggested, and it isn’t all that different from G-Sync’s. Nvidia’s method of avoiding collisions seems like it might be superior in some ways, but we’re talking about small differences.

You can see a difference between FreeSync and G-Sync in a contrived scenario involving a fixed frame rate below 40Hz. To record the video above, I ran Nvidia’s “Pendulum” demo side by side on the XL2730Z and a G-Sync display, with the demo locked to 30 FPS on both systems. In this case, G-Sync’s collision avoidance logic looks to be pretty effective, granting a marginal improvement in animation smoothness over the BenQ FreeSync monitor. (In most browsers, you can play the video at 60 FPS via YouTube’s quality settings. Doing so will give you a more accurate sense of the motion happening here.)

The video above was shot with vsync enabled on the FreeSync display. If you turn off vsync, you’ll see lots of tearing—an indication there are quite a few collisions happening in this scenario.

When playing a real game, though, the frame times are more likely to look something like the plot above most of the time—not a perfectly spaced sequence of frames, but a varied progression that makes collisions less likely.

Testing the practical impact of these differences in real games is tough. Nothing good is happening when your average frame rate is below 40 FPS, with bottlenecks other than the display’s behavior coming into play. Sorting out what’s a GPU slowdown and what’s a display collision or quantization isn’t always easy.

Still, I made an attempt in several games intensive enough to push our R9 290X below 40 FPS. Far Cry 4 was just a stutter-fest, with obvious system-based bottlenecks, when I cranked up the image quality. Crysis 3, on the other hand, was reasonably playable at around 35 FPS.

In fact, playing it was generally a good experience on the XL2730Z. I’ve seen low-refresh quantization effects before (by playing games on one of those 30Hz-only 4K monitors), and there was simply no sign of it here. I also had no sense of a transition happening when the frame rate momentarily ranged above 40Hz and then dipped back below it. The experience was seamless and reasonably fluid, even with vsync enabled for “out of bounds” frame intervals, which is how I prefer to play. My sense is that, both in theory and in practice, FreeSync handles real-world gaming situations at lower refresh rates in perfectly acceptable fashion. In fact, my satisfaction with this experience is what led me to push harder to understand everything I’ve explained above.

Remember, also, that we’re talking about what happens when frame rates get too low. If you tune your image quality settings right, the vast majority of PC gaming should happen between 40 and 144Hz, not below the 40Hz threshold.

 

Ghosting and persistence

Another bone of contention in the FreeSync-versus-G-Sync wars is the question of ghosting, those display after-images that you can sometimes see on LCDs. Turns out that the BenQ XL2730Z in particular has a ghosting issue in a very prominent scenario: AMD’s own “Windmill” FreeSync demo. Below is a side-by-side slow-motion video that shows the Asus PG278Q versus the BenQ.

Watch the trailing edge of the windmill when the arm is moving the fastest—in the same direction as the base of the windmill—in order to see the ghost image. The problem is readily apparent on the BenQ display, but not on the Asus G-Sync monitor. For those who can’t be bothered to watch the video, here’s a single frame that tells the story.


The ROG Swift PG278Q (left) vs. the BenQ XL2730Z (right)

This isn’t exactly the worst problem in the world, but some ghosting is apparent on the XL2730Z. That’s true of the video, and if anything, the after-image is even easier to see in person. My first look at FreeSync in action was this demo, so seeing ghosting right away was a bit disappointing.

Here’s the thing to realize: AMD’s demo team has managed to concoct one heck of a scenario to bring out ghosting. The scene is high contrast, the blades sweep across the screen quickly, and the ghosts come out to play. I’ve spent some time with the XL2730Z, playing games and running the UFO tests and such, and this sort of ghosting isn’t nearly as apparent in other cases.

There’s a bit of after-image visible in the UFO ghosting test, but it’s pretty minimal and wouldn’t raise any red flags during our usual display testing routine.

Ghosting is nearly impossible to detect with the naked eye in this relatively high-contrast scene from Borderlands: The Pre-Sequel. I recorded this video at 240 FPS, and it plays back at half game speed, so I was really sweeping the mouse around quickly. (60 FPS playback is available via YouTube.) Some small amount of ghosting is visible in this slow-mo video, but even at half speed, you have to watch carefully to see it.

The XL2730Z does have some ghosting issues that are perceptible in certain cases, but they are not especially common or distracting overall, in my view. I’ve seen much worse from cheaper monitors in the past. The ghosting issue has become a bit of a hot topic in part because, again, Nvidia has hinted that ghosting may be worse with FreeSync displays than with G-Sync displays.

I asked Nvidia’s Tom Petersen about this issue, and he explained that maintaining the correct overdrive voltage needed to reduce ghosting on an LCD with variable refresh rates is a tricky challenge, one that Nvidia worked to overcome in the development of its G-Sync module. I think that’s a credible claim.

When I asked AMD’s Robert Hallock for his take, he responded with: “In our view, ghosting is one part firmware, one part panel, and zero parts FreeSync.” I think that also is a credible statement.

The difference here is that with AMD’s collaborative approach, the burden for ensuring the correct overdrive voltage falls to the makers of the monitor and the display logic chip. Since Nvidia has built its own display logic chip, the responsibility for tuning the panel voltage on G-Sync monitors falls mostly on Nvidia’s shoulders.

The fact that we’ve seen some ghosting on BenQ’s XL2730Z doesn’t necessarily indicate there will be a general issue with ghosting on FreeSync displays. We’ll have to wait and watch to see how other FreeSync monitors perform before we can generalize. It’s up to the makers of FreeSync monitors and logic chips to tackle this problem.

Speaking of which, BenQ has built a blur-reduction feature into the XL2730Z that can be enabled via the monitor’s settings menu. Turn it on, and you get a low-persistence display mode that strobes the backlight, very much like the ULMB mode built into the ROG Swift PG278Q. This feature does reduce ghosting and increase the clarity of objects in motion, but it also quite visibly lowers the screen brightness. Cranking up the brightness can offset that effect, to some extent. As with the ULMB mode on G-Sync displays, BenQ’s blur-reduction mode is not compatible with variable refresh rates, so it’s probably more of a curiosity than anything.

 

All the trimmings

As sweet as variable refresh can be, the XL2730Z doesn’t rely on just a single feature. It’s loaded with extras, many of them aimed squarely at gamers. Let’s zoom in on that collection of ports on its side. What’s that little red doohickey?

Push in on the red thing, and a slender, metal bar slides out of the side of the monitor, ready to act as a storage spot for your gaming headset. Nifty, I’ve gotta say. You can even plug your headphones’ audio and mic into the monitor’s two dedicated ports, which act as pass-throughs for audio over DisplayPort and HDMI—no sound card needed. I listened to a little music on my headphones via this connection, and the sound quality seemed quite decent.

The l33tn355 continues with the dual USB 3.0 SuperSpeed ports connected to the BenQ’s internal USB hub. None of these touches are necessary in a good display, but they certainly add to the XL2730Z’s cachet.

Should you decide to use the XL2730Z for something other than gaming with variable refresh, it’s capable of working with virtually anything with a display output, from classics like ye olde VGA and DL-DVI to the new hotness of HDMI 2.0.

This monitor’s stand tilts, swivels, slides, and pivots in just about any way you might want, with plenty of leeway in each direction. The flexibility includes 5.5″ of height adjustment, 25° worth of tilt, 45° of swivel in each direction, and 90° worth of pivot into portrait mode. The stand attach point conforms to VESA standards, so the monitor can be mounted on alternative hardware if the user so desires.

Really, the only possible premium feature not included here is a set of internal speakers, and those tend to be useful only as a last resort.

Menus, a puck, and some wonky defaults

The XL2730Z’s premium vibe is almost lost when it comes time to make an adjustment to one of the monitor’s settings. You’re confronted with a series of five identical buttons on the front of the monitor, each one mapped to an adjacent on-screen navigational icon.

Yes, most monitor control menus work this way, but I don’t like any of them. They’re hard to navigate and make tweaking a pain. The situation isn’t helped by the fact that BenQ has packed in a ton of menu options, some of questionable value.

Happily, the menu nav situation is redeemed by the funky device pictured above. This control “puck” isn’t a mouse, but its scroll wheel and buttons make navigating through on-screen menus quick and elegant. There’s a recessed spot in the monitor’s base where the puck can rest, or it can be placed anywhere on the nearby desktop for easy access.

The numbered buttons on the puck allow for quick switching into three different game modes, collections of user-definable display settings. The on-screen menus offer access to another five profiles under different labels. I can’t imagine wanting to have a separate monitor profile for each game or application that I use, but I suppose some folks might appreciate the option.

Some of BenQ’s other choices, though, leave me cold. The XL2730Z’s default setting for sharpness is too aggressive, so it produces some strange aliasing around ClearType fonts. I was able to fix the problem by dialing back the sharpness in the menu, but the monitor comes out of the box with some quirky behavior.

 

Brightness and contrast

Speaking of quirky behavior, the XL2730Z has a feature with the unintentionally hilarious name of Black eQualizer. BenQ claims it can “brighten dark scenes without over-exposing the bright areas.” Sounds a bit gimmicky, but whatever; I figured I could play around with it later and see what it did. Then I went to test the display at its default settings, and, well, look at this gamma response measurement.

What is supposed to be a nearly flat line at about 2.2 across the board is instead a logarithmic curve. I’m pretty sure that’s the Black eQualizer feature at work, attempting to make sure that no baddies can hide in the dark shadows during a game. Trouble is, this feature is enabled by default on the XL2730Z, and it doesn’t come without a price. This gamma response curve has negative consequences for black levels and contrast ratios, which is no surprise when you think about it. This feature intentionally reduces color fidelity in order to do its thing.
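
For reference, here is roughly how a gamma figure is backed out of luminance measurements; a well-behaved display returns about 2.2 at every test level, which is why the ideal plot is a flat line. The readings in this sketch are invented for illustration, not our measured data.

```python
# Solve luminance = white_luminance * level**gamma for gamma at each test level.
# These readings are made-up illustrative numbers, not measurements.
import math

white_luminance = 200.0  # cd/m2 at full white
readings = {0.25: 9.6, 0.50: 43.5, 0.75: 106.0}  # input level -> measured cd/m2

for level, lum in readings.items():
    gamma = math.log(lum / white_luminance) / math.log(level)
    print(f"input {level:.2f}: gamma = {gamma:.2f}")  # all roughly 2.2
# Black eQualizer brightens the dark steps, which shows up as much lower gamma
# values at the low end of the range, producing the curve in the plot above.
```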

I was able to disable Black eQualizer in the monitor’s settings, but frustratingly, the feature kept coming back on after I calibrated the display. Ultimately, I had to disable ADC in my calibration software in order to keep Black eQualizer from resurrecting itself. Once this feature was turned off, the BenQ’s overall fidelity and contrast improved, bringing it closer in line with the ROG Swift PG278Q, which is based on a similar LCD panel.

Because it’s enabled by default rather than offered as an opt-in menu option, Black eQualizer goes from being a harmless gimmick to something worse. This “feature” makes the XL2730Z a less capable display out of the box and requires intentional tuning to overcome.

We put in that work before taking the measurements below, so the BenQ display was able to put its best foot forward. Black eQualizer was disabled and the display was calibrated prior to these tests. The other monitors were calibrated, as well.

As you can see, the XL2730Z compares favorably on this front to the two other monitors we have on hand for comparison. We’ve already introduced the ROG Swift PG278Q, the BenQ’s obvious rival based on G-Sync and perhaps the exact same LCD panel. Our other contestant, the Asus PB278Q, is the same size and resolution as the two variable-refresh monitors but is based on an IPS-type panel, generally considered the standard for image quality among LCDs.

Color reproduction

Click through the buttons below to see the color gamut ranges for the displays, both before and after calibration. Color gamut has to do with the range of colors the display can produce, and it can vary widely from one monitor to the next. The gray triangle on each diagram below represents the standard sRGB color space.


The XL2730Z’s gamut almost completely encompasses the sRGB color space. The IPS-based PB278Q is capable of displaying some deeper reds and purples than our two TN panels, but those hues are largely beyond the bounds of the sRGB standard.


The BenQ monitor’s default color temperature isn’t far off of our 6500K target, and then it snaps into line almost perfectly after calibration.


Remember, these measurements are affected by the Black eQualizer feature I discussed above. I’ve included three sets of results for the BenQ: at the default settings, after calibration with Black eQualizer enabled, and after calibration with Black eQualizer disabled. As you can see, Black eQ wreaks havoc with the monitor’s gamma response even after calibration. Fortunately, turning this feature off yields a nice, flat gamma response across the board.

Delta-E is a measure of color difference—or error—compared to a reference. Smaller delta-E values generally mean more accurate colors. We measured delta-E in the sRGB color space with a D65 white point, both before and after calibration.
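
As a quick illustration of the metric, the simplest delta-E formula (CIE76) is just the Euclidean distance between two colors in the Lab color space; newer variants such as CIEDE2000 weight the terms differently, but the idea is the same. The values below are invented for the example.

```python
# Delta-E (CIE76): Euclidean distance between a reference color and the color
# the display actually produced, both expressed in Lab coordinates.
def delta_e_cie76(lab1, lab2):
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

reference = (53.2, 80.1, 67.2)  # an ideal sRGB red in Lab (approximate)
measured  = (52.8, 77.5, 64.9)  # what a hypothetical panel shows instead
print(f"delta-E: {delta_e_cie76(reference, measured):.2f}")  # ~3.5
```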

Once calibrated (and with Black eQ disabled), the XL2730Z offers the most faithful color reproduction of the group. Reds are the largest source of error for this display, as they are for the PG278Q.

The XL2730Z just barely trails the PG278Q in grayscale color accuracy after calibration, but both outperform the IPS panel we have on hand. This is not your father’s TN panel, folks. It’s pretty darned good.

 

Display uniformity

Displays typically don’t produce the exact same image across their entire surfaces. We’ve quantified uniformity by taking a series of luminance readings in different regions of the panel. We set the brightness level at ~200 cd/m² at the center of the screen before starting.

193 cd/m² (97%)    192 cd/m² (96%)    185 cd/m² (93%)
200 cd/m² (101%)   199 cd/m² (100%)   188 cd/m² (94%)
187 cd/m² (94%)    184 cd/m² (92%)    178 cd/m² (89%)

This amount of variance—11% at most—isn’t anything to worry about. In fact, you’d be hard pressed to notice it in regular use. By way of comparison, we measured the PG278Q’s max variance at 13% from center to edge.

I’ve chosen to convey backlight bleed using a picture rather than a series of measurements. Trouble is, I never know exactly how this image will end up looking on the reader’s own screen. When I look at the XL2730Z displaying a black screen like this, I see a little more light bleeding through the leftmost quarter of the screen area than elsewhere. That bleed increases and takes on a bit of a peachy color if I move my head too far to the left or right. You’ve got to be in a dark room with the brightness turned up a bit in order to see anything but a uniformly black screen, though. Our colorimeter measured a peak difference in black levels of 12% from the display’s edge to center, which again isn’t bad.

Viewing angles

I’ve taken these pictures in order to demonstrate how the display’s color and contrast shift when the viewing angle changes.

Although this monitor’s TN-type panel does exhibit noticeable color shift, that shift is pretty subtle at common desktop viewing angles. The pictures above illustrate the extent of it. If you were sitting on the floor or standing up and looking down at the display from a sharp angle, you’d see a more dramatic change in color temperature and contrast. For regular desktop use, though, this thing is pretty solid.

Input lag

TN panels tend to be quick, and this one is no exception. The XL2730Z’s gray-to-gray transition time is one millisecond, substantially quicker than the five-millisecond rating for the IPS-based PB278Q.

Input lag comes from many sources, including the scaler chip inside the monitor. To measure lag, we compared the XL2730Z against my old Dell 3007WFP. The 3007WFP’s IPS panel isn’t particularly fast, with an 8-ms gray-to-gray spec, but this monitor has no internal scaler chip, so there’s zero input lag from that source. Both displays were connected to the same graphics card in clone mode while running an on-screen timer at 60Hz, and then we took some high-speed photos.

Dell 3007WFP (left) vs. BenQ XL2730Z (right)

In the example above, the XL2730Z is a single frame behind the 3007WFP at the time of the exposure. The XL2730Z either tied or ran slightly behind the 3007WFP in the series of images we captured. The practical difference between them is very small.

Now, let’s compare to the G-Sync-infused PG278Q.

ROG Swift PG278Q (left) vs. BenQ XL2730Z (right)

These two displays ran head to head as the counter incremented, pretty much exactly matching one another each time.

Power consumption

The BenQ’s power draw is reasonable overall. It’s a little lower at peak than the PG278Q’s, but the Asus display produces more light at max brightness.

 

My verdict on the BenQ XL2730Z

The BenQ XL2730Z is based on a very high quality TN panel, and it generally outperforms the IPS-based monitor we used for comparison. That’s no fluke; we’ve seen the same behavior from the Asus PG278Q, which we strongly suspect is built around the same TN panel. In fact, in virtually all of our empirical measurements of display quality, the PG278Q and the XL2730Z track together closely. The only respect in which these two are not at least the equal of an IPS panel is off-angle viewing. As we’ve noted, though, the traditional TN color and contrast shifts aren’t readily apparent if you’re sitting at a desk in front of one of these displays. We’re just beginning to see some of the first IPS-based displays with fast, variable refresh rates trickle into the market. The XL2730Z remains an intriguing option regardless.

BenQ has stumbled a bit by giving the XL2730Z some questionable default settings. I’m still shaking my head about the fact that Black eQualizer is turned on by default. These problems can be overcome with proper tuning, but I really wish it weren’t up to the user to undo BenQ’s iffy choices. Those complaints aside, however, the XL2730Z has everything one would want in this class of display product. It’s packed with features, including a wealth of physical adjustments and postures.

Overall, the XL2730Z is indisputably one of the best gaming monitors on the planet. If you have a fairly beefy Radeon graphics card that’s FreeSync capable, this display would make an excellent companion to it. I spent quite a few hours questing for loot in Borderlands: TPS on the XL2730Z and a Radeon R9 290X, and the two working in concert is a wondrous thing. You need to try it for yourself to understand. As I’ve said before, you don’t exactly notice variable refresh when it’s working well, because it’s silky smooth and seamless. What you’ll notice most is how broken everything feels if you have to go back to a fixed refresh rate, especially at 60Hz. Display quantization is an irritant, and you’ll be glad to be rid of it.

Some further thoughts about FreeSync

Spending time with a FreeSync monitor and walking through the gauntlet of supposed issues has crystallized my thoughts about some things. AMD and its partners have succeeded in bringing variable refresh technology to market using an open, collaborative approach. The concerns we’ve seen raised about niggling problems with FreeSync displays in specific cases, such as low-FPS scenarios and ghosting, are really nibbling around the edges. Yes, at the end of the day, the G-Sync-infused Asus ROG Swift PG278Q is slightly superior to the XL2730Z in certain corner cases. But I wouldn’t hesitate to recommend the XL2730Z, especially since it costs less than the Asus. The XL2730Z would be a huge upgrade for most gamers.

In fact, the BenQ XL2730Z is good enough that I think it’s time for the rest of the industry to step up and support the VESA standard for variable refresh rates.

Nvidia has indicated that it intends to continue the development of G-Sync, perhaps adding new capabilities beyond variable refresh. Some of the possibilities—like eliminating the left-to-right, top-to-bottom LCD panel refresh pattern or interpolating between frames during long intervals a la Carmack’s time warp—could improve gaming displays even further. I don’t want to discourage such developments. But there is no technical reason why today’s GeForce GPUs can’t support variable refresh on Adaptive-Sync displays. All it would take is a driver update. If Nvidia really believes G-Sync offers compelling advantages over Adaptive-Sync, it should show that faith by supporting both display types going forward. Let consumers choose.

Heck, I expect Nvidia will be forced to support variable refresh rates without the use of its proprietary module in order to bring G-Sync to laptops. That move could prompt some uncomfortable conversations about why desktop displays still absolutely require that module.

Another quiet player in this drama is Intel, whose integrated graphics processors ship in tons of PCs of all types. Intel has attempted to insert itself into the conversation about PC graphics in recent years by stepping up its driver support and introducing features like PixelSync that have influenced new DirectX versions. I can think of no better way for Intel to signal its commitment to PC gaming than openly backing Adaptive-Sync.

Besides, variable refresh rates can transform marginal GPU performance into a good experience. They’re a great match for the limited horsepower of integrated graphics solutions. A desktop all-in-one with Iris Pro graphics and a variable-refresh display is a pretty compelling prospect.

Surely Intel IGPs already have support for variable refresh rates built in, because VRRs have been a potential source of power savings in laptops for years. (Also, Intel already supports a related technology, panel self-refresh, with similar functional underpinnings.) Assuming there are no weird technical snags holding it back, Intel ought to be writing its press release in support of Adaptive-Sync right now.

I provide updates at variable intervals on Twitter.

Comments closed
    • plonk420
    • 4 years ago

    does this monitor let you set refresh rates of say 48hz (for most films or possibly even if The Hobbit gets released in some kind of perverted format at 48/96fps (the latter for 3D) …not that i condone the 48fps version) or 50hz for deinterlaced European content? (see the VFR release of Top Gear as an example… news and laps and star in a reasonably priced car sections are in 50fps, the rest, 25.)

    • itachi
    • 4 years ago

    One quick question, I read about the max FPS limitation with Gsync and maybe Freesync too, how does that works ? If use 144hz Gsync or freesync, is it not possible to get 144fps ?

    • itachi
    • 4 years ago

    1600p 144hz with freesync and gsync plz, lol.

    • Krogoth
    • 5 years ago

    Let’s have a moment of silence for what could have been.

    We could have SED and FED monitors now in the marketplace instead of trying to polish up the known weaknesses of LCDs.

    Too bad that tech is trapped in patent troll hell.

      • cygnus1
      • 4 years ago

      I meant to reply to this earlier, but got distracted. I googled SED and read the wikipedia page. I’m now sad that we don’t have this tech available widely. It pretty much seems like all the good aspects of CRT with all of the good aspects of a plasma or LCD combined into one tech. Maybe one day….

    • Wild Thing
    • 5 years ago

    Wait,so Free-Sync is good and does actually work as advertised?
    Can’t wait to read all the “I was soo wrong” comments about it down below.
    *checks thread*
    Hmm seems the most vocal haters here have lost their tongues….

    • Freon
    • 5 years ago

    Interesting to shed light on the min/max frame intervals affect on experience. It looks like a good min/max spread is going to be a key feature moving forward. Something with a 60-144 might be worse than something with a 40-120 spread if you end up gaming much at ~60fps averages.

    My stopgap is overclocking my Korean IPS/PLS panel display. Only seems to work well at up to 100hz, but at least that mitigates the effects of tearing with vsync off, which is how I still play.
    It’s harder to see tearing the faster the refresh is since any given tear is not visible for as long.

    • Odinson
    • 5 years ago

    I really don’t understand the push for higher resolution displays for gaming. Unless you have really deep pockets I just don’t think its worth it.

    I have one of those Korean 27″ QHD monitors and I love the picture quality but gaming on it is a pain. I have an i5-2500k at 4.4ghz water cooled, 16 gigs of ram at 1600 mhz, a modern SSD, and two MSI power edition 660 ti’s in SLI.

    At 1080p everything is smooth as butter with everything maxed out. 1440p is mostly smooth with some hiccups though.

    Pixels at 1080p = 2073600
    Pixels at 1440p = 3686400

    So 1440p has 1.78 times the pixels of 1080p. It comes out to being a huge performance loss. Personally I would take the improved and smoother performance over the extra pixels. Am I the only one that feels that way?

    I want a 27″, 1080p, g-sync monitor with good picture quality and under $300. I hope it doesn’t take too long for that to happen.

      • sr1030nx
      • 4 years ago

      Not having a 1440p monitor to play with, it’s my understanding that due to the much higher pixel count, it has a similar effect to anti-aliasing.
      If so, you could get a 1440 monitor and not use AA, and still have the same framerates as 1080 with AA.

      On a different note, I’m going to wait to see if any freesync monitors start coming with a lower range at which they’ll work (25/30hz?).

    • TopHatKiller
    • 5 years ago

    Tar’an all for the thoughtful review.
    Am personally gratified that the freesync vs gsync corroborates my previous comment on the matter. However it’s a pipe-dream that Nv will ever adopt freesync, the company’s culture and [i]Mr Jen-Hung-less[/i] [stole that: but it's funny] will never allow it. Intel might though [now that AMD will not appear anywhere.] Hope so. Any monitor that supports both standards will have to be priced at the Nv license level. But I suppose that's better then being 'forced' to stay with AMD or NV. Mhmm.

    All these bloody 120hz+ monitors are just to damn expensive though; Scan.co.uk lists the BenQ at about £480 - and the new ips 144hz Asus mg279 at £490. Incidentally that asus runs freesynch at 35-90hz. As AMD stipulates freesynch can run between 9 & 240 - I just find this limited variable window odd. Tally-ho.

    Ps. If some nice person would want to send me, ugh a couple of thousand, I could buy several of them and then give you people my real review? Any takers? Bugger, thought not.

    • Tirk
    • 5 years ago

    Good review. Unfortunately I had to get a monitor before more freesync monitors are released so it’ll have to be next buying cycle before I try out Freesync for myself.

    Luckily I have more time to wait for the 300 series to come out and the 980TI before I have to pick up another gpu so I’ll get to see those reviews before I make a choice 🙂

    • Kretschmer
    • 5 years ago

    This makes me frustrated with Nvidia. I like their drivers and hardware but balk at paying $200 extra for a monitor tied to dead tech.

      • VincentHanna
      • 5 years ago

      This will become a relevant comment, if anyone ever releases a freesync monitor that isn’t also $200 extra.

        • Kretschmer
        • 5 years ago

        The BenQ (1440P, 144Hz, TN, Adaptive Sync) is $630 on Newegg
        The Asus ROG(1440P, 144Hz, TN, Adaptive Sync) is $780 on Newegg

        Is there a $150 qualitative difference in these monitors?

          • anotherengineer
          • 5 years ago

          Don’t know.

          BUT POST 200!!!!!!!!!!!!!!

          • VincentHanna
          • 5 years ago

          So what you are saying is that neither monitor costs ~$300.00?

          Let me know when that changes.

    • Bensam123
    • 5 years ago

    So what’s the difference between adaptive v-sync and freesync?!?!?! Freesync is adaptive v-sync, only we didn’t know it was AMDs doing before assuming adaptive v-sync was vesa’s own thing… And then AMD has the audacity of ‘certifying’ adaptive v-sync monitors as ‘Freesync’ monitors. They’re doing it for free too! Crap, it almost seems as though this is something Nvidia could support… and monitors that work with adaptive v-sync could also be certified as Freesync!

    OMGOMG it’s like Vulcan and Mantle all over again. AMD is invading everything without our consent or knowledge and making us think the opposite! Call the Nvidia raid brigade!

    Joking aside, great article. Freesync is pretty much going to become the norm at this point… erm ‘adaptive v-sync’ that possibly could be Freesync certified. Can’t wait for the FreeG-Sync certified monitors. ^^

    Also hope for the continued evolution of panels. This is definitely a needed injection and hopefully will start pushing the other scaler manufacturers towards including such G-Sync changes if and when they happen.

      • jihadjoe
      • 5 years ago

      I think you’re referring to Adaptive Sync and Freesync.

      Adaptive V-Sync is an older tech Nvidia developed during Kepler which toggles V-sync off and on as the game falls below, or matches/exceeds the panel’s refresh rate respectively.

        • Bensam123
        • 5 years ago

        Shit… Typo ruined the whole thing. You win this one pedant.

    • anotherengineer
    • 5 years ago

    It’s kind of funny how far we have come since the 90’s in technology regarding games. And even with all the graphics, high refresh rates, etc. ChronoTrigger for the Super Nes way back in 1995 still remains one of my top 5 personal favourite games.
    Even with its 16-bit architecture and running on an old CRT TV with probably 320 lines of resolution. I guess there is something to be said for a good quality story line and a nice soundtrack. 144 fps be damned!

      • jts888
      • 5 years ago

      SNES most common modes were actually only 256×224 and 256×239.

      • jihadjoe
      • 5 years ago

      Thanks for reminding me that I still haven’t gotten all the endings.

      Managed to collect all of Crono’s swords though (including the Mop).

    • ronch
    • 5 years ago

    I’ve been using my LG 22″ monitor (W2252TQ) for more than 6 years now and I have no plans to replace it, at least not yet. It’s a TN panel but it’s one of those sort of belongings that you think works really well and has never failed you. If I ever replace it, it’ll be because of either G-sync or Freesync, but personally I think the addition of a dedicated module within the display itself is kinda overkill considering AMD has done practically the same thing without the additional components. Nvidia’s approach MAY have a slight edge, but if Freesync delivers practically the same gaming experience, I’m all for it.

    Maybe I’ll wait a couple more years or so for them to iron out the kinks with Freesync. No rush here.

    • Jigar
    • 5 years ago

    Excellent review, thank you Damage

    • kfleszar
    • 5 years ago

    Let’s take it to the next level.

    This variable refresh is all great, but even if we have monitors capable of refreshing anything between 1Hz and 1kHz, we still have the lack of smoothness due to variable frame rendering time. I mean if one frame takes 10ms to render and is displayed immediately, and another takes 20ms and again is displayed immediately, there is still going to be some choppiness in what I see. After all, one frame is 10ms old when I see it and the other is 20ms old.

    What I would like to see is for the GPUs to do something like a guaranteed 10ms frame rendering time. If the world is too complex, dial down on the details or on something else, but show it to me fast. This would actually make this business of variable refresh rate obsolete.

    Yes, I know this is a tough problem, but this is the problem that should be solved. Just my 2 cents.

      • kfleszar
      • 5 years ago

      Downvotes without feedback. Am I wrong in any way?

        • Flapdrol
        • 5 years ago

        People just like to downvote. Cpu time per frame can fluctuate as well, but unless it fluctuates really badly the result will still look smooth, maybe the motion will look a bit weird.

    • emphy
    • 5 years ago

    [quote]...a contrived scenario involving a fixed frame rate below 40Hz.[/quote]

    I wonder if playing some media files might be a better use case?

    • JustAnEngineer
    • 5 years ago

    Is it bad that I know exactly where in Tyria your low level ranger was standing when you captured those videos?

      • anotherengineer
      • 5 years ago

      Is it worse that I had no clue what game that was??

        • sweatshopking
        • 5 years ago

        no.

          • anotherengineer
          • 5 years ago

          Whew, thank goodness 😉

          edit – hey SSK, do you think Deanjo when you see this??

          [url]https://www.youtube.com/watch?v=fKP1ojOFgZI[/url]

      • Damage
      • 5 years ago

      Probably not exactly good, but kinda awesome, right?

    • jihadjoe
    • 5 years ago

    [quote]The time has come for an awful lot of folks to consider an upgrade.[/quote]

    I think it's a terrible time to upgrade. A lot of competing implementations/standards are still in flux and personally I'd wait for the dust to settle down before committing to a monitor purchase which I'd expect to last at least 5 or 7 good years. Buying now would be like buying into an HD format at the start of the HD-DVD and Blu-Ray wars.

      • l33t-g4m3r
      • 5 years ago

      Agreed. Prices are too high, and freesync monitors haven’t fixed their ghosting issues yet. Good to see the review mentioning that this is a panel issue and not freesync itself, because that’s the truth of the matter. Also, video cards that can power resolutions higher than 1080p, are still too expensive. I’d rather wait until prices drop, and the ghosting is fixed.

        • Ninjitsu
        • 5 years ago

        About ghosting:
        [quote]The final black mark next to FreeSync’s name involves ghosting. I’m actually a little reluctant to call it that since I’m not sure the problem is necessarily the same as the conventional LCD monitor ghosting. But it looks pretty similar and gets the idea across. Essentially, with FreeSync enabled, a shadowy ‘ghost’ version of moving objects can be seen trailing just behind in their wake. Much depends on speed of movement and the colours of both the objects and the background. But as it happens, it’s particularly apparent with AMD’s FreeSync demo involving a 3D-rendered wind turbine. The ghosting that appears behind the blades with FreeSync enabled is as obvious as it is ugly. Critically, it’s not there with FreeSync off, regardless of any other settings including refresh rates or V-sync. [b]So it’s not an inherent panel problem.[/b] You can also run the same demo on the Asus G-Sync panel with dynamic refresh enabled and clearly see that Nvidia’s solution doesn’t have the same problem.[/quote]

        [url]http://www.rockpapershotgun.com/2015/04/09/g-sync-or-freesync-amd-nvidia/[/url]

          • l33t-g4m3r
          • 5 years ago

          [quote]When I asked AMD's Robert Hallock for his take, he responded with: "In our view, ghosting is one part firmware, one part panel, and zero parts FreeSync." I think that also is a credible statement.[/quote]

          It's the panel firmware. They haven't worked the bugs out of variable refresh. I also have a feeling variable refresh requires real time adjustments, which is why NV's solution uses more robust hardware. AMD might have to do some calculations on the video card, but ultimately ghosting is a panel issue, and the reason it ghosts is because freesync was rushed out before fixing the issues associated with variable refresh.

          [quote]I asked Nvidia's Tom Petersen about this issue, and he explained that maintaining the correct overdrive voltage needed to reduce ghosting on an LCD with variable refresh rates is a tricky challenge, one that Nvidia worked to overcome in the development of its G-Sync module.[/quote]

          In other words, Gsync uses something like variable overdrive voltage, while Freesync doesn't. Is this a problem with Freesync then? No. This is an issue where the freesync panel doesn't support the same overdrive settings as the Gsync panel, and that's up to the manufacturers to overcome. If the panel can't handle those calculations, they need to either beef up the panel's hardware, or somehow have those calculations done on the video card. Either way, ghosting is a panel issue. Think about it for more than two seconds. Video cards don't ghost. Panels do.

            • Ninjitsu
            • 5 years ago

            Thanks for the explanation. Makes a lot of sense. I guess even AdaptiveSync needs more investment when used for gaming purposes.

            I guess it makes sense too: Adaptive refresh rates are for power savings as far as VESA is concerned; they’re more likely to want lower voltages/less switching when possible.

            Sounds like for gaming the panels need the opposite.

      • Cuhulin
      • 5 years ago

      Agreed. However, that may take 2-3 years to settle down.

        • willmore
        • 5 years ago

        Which is good because my wife will kill me if I get a new monitor right now. I still need to suffer through my 2048×1152 display. 😉 #notsuffering

      • jessterman21
      • 5 years ago

      Absolutely – prices are way too high still, and Nvidia’s DSR compels me to stick with 1080p for a good while longer. That’s why I recently bought a cheaper interim monitor, and I’m loving my first VA panel.

      • cygnus1
      • 5 years ago

      And everyone with a Blu-Ray player will say “whatever, buy now,” but those with an HD-DVD player will very much agree with you. I guess we need to find out which standard porn companies are going to use, because apparently that industry’s choice has decided our media format winners: VHS over Betamax, DVD over whatever other option there was, and Blu-Ray over HD-DVD. All the winners were picked/supported by companies selling porn.

      • tootercomputer
      • 5 years ago

      I was going to post the same quote, but from a somewhat different tack. I would always like to upgrade various parts of my systems, but I have lots of other responsibilities and costs in life that take priority. That said, I thought I was doing pretty well by finally managing to get my first IPS monitor a month ago.

    • Mo
    • 5 years ago

    Any indication if this screen has the same “inversion” issues that the PG278Q has? The review didn’t mention anything, but I didn’t see anything about it on the PG278Q review either.

    The effect I’m talking about is vertical lines on objects that move or flicker quickly, especially with bright objects on dark backgrounds. The [url=http://www.tftcentral.co.uk/images/asus_rog_swift_pg278q/response_comparison.jpg<]overdrive image[/url<] from TFTCentral's review shows the effect somewhat, to give an idea of what to look for. I don't know if the actual problem is caused by the LCD inversion process, but the problem is widespread on the PG278Q, and that's the name the problem is most commonly associated with online.

    I'd love to see a TR investigation of the effect on the PG278Q, actually, since you've got access to high-FPS cameras and tend to go in-depth on this stuff. Great job on the investigation of the FreeSync low refresh threshold, btw. Their solution seems "good enough" for their hardware-light approach, even if it's not perfect.

    • Firestarter
    • 5 years ago

    Thanks for looking into the issue of low framerates! I was afraid that AMD hadn’t managed to do it right the first time, but I’m glad to see I was wrong. I really want this monitor now, and I agree that Nvidia and Intel should jump on the DisplayPort 1.2a bandwagon.

    [quote<]BenQ's blur-reduction mode is not compatible with variable refresh rates, so it's probably more of a curiosity than anything[/quote<] I have seen the effects of a strobed backlight with my own 120Hz monitor, and the effect is certainly [i<]not[/i<] a curiosity! It greatly improves our eyes' ability to track motion, like when simply looking around in an FPS. If a system is capable of hitting your desired frame rate and refresh rate and you're looking for every edge in competitive play, I think a strobed backlight is a must and definitely worth the trade-off of not having variable sync, because you won't need variable sync anyway.

      • Chrispy_
      • 5 years ago

      You’re missing the point. It’s a curiosity on a variable refresh display because you’re clearly paying the premium for variable refresh.

      Why would anyone go to the effort of buying a matched graphics card and premium display for variable refresh, only to then ignore those requirements and disable the variable refresh for something altogether different?

      Scott is not saying that strobing backlights are bad, he’s just saying that they’re not the focus of a variable-refresh display, hence a curiosity. It’s like buying an aeroplane to go faster than a car [i<]on the ground[/i<]. Sure, the aeroplane [b<]*can*[/b<] reach higher speeds than most cars on the ground, but that's completely missing the point; you don't buy an aeroplane to travel fast [i<]on the ground[/i<]...

        • Firestarter
        • 5 years ago

        Sometimes you play Crysis, and sometimes CS:GO. This monitor can do both extremely well, and dismissing low-persistence mode as a curiosity is dismissing the needs of a group of hardcore gamers.

        but who am I kidding, I’m not going to argue with you because you will win by sheer persistence

        • Prestige Worldwide
        • 5 years ago

        Another day, another car analogy in a tech site comment.

      • Prestige Worldwide
      • 5 years ago

      I did think ULMB was a nice feature, but the brightness was way too low for me to continue using it on my XL2420T 120Hz screen.

        • Firestarter
        • 5 years ago

        same problem here, plus that my Samsung S23A700 doesn’t actually officially support it so it really messes up the colors too. That said, the increased clarity convinced me that I need it for my next monitor

          • Prestige Worldwide
          • 5 years ago

          I had to enable it with the Strobelight program from Blurbusters, so I suppose mine also didn’t officially support it.

          But from the sound of many reviews (including this one), official support for ULMB is still too dim. It would be nice if they could tweak the backlight for brighter strobing.

    • Flapdrol
    • 5 years ago

    [quote<]Yes, at the end of the day, the G-Sync-infused Asus ROG Swift PG278Q is slightly superior to the XL2730Z in certain corner cases. But I wouldn't hesitate to recommend the XL2730Z, especially since it costs less than the Asus.[/quote<] A whopping 9 euros less, on a 700-euro monitor.

      • Chrispy_
      • 5 years ago

      £107 price difference in the UK when looking at the prices with tax.
      That’s a €148 difference. Your resellers are clearly just greedy profiteers.

        • Firestarter
        • 5 years ago

        €10 difference here in Germany, I don’t know what’s up

        edit: €14 in the Netherlands

        edit: my edit borked the preview?

      • TO11MTM
      • 5 years ago

      In the US it’s a $150 difference, at least shopping around my typical retailers.

        • Flapdrol
        • 5 years ago

        I guess BenQ loves its regional pricing.

          • TO11MTM
          • 4 years ago

          Probably. In my region of the US BenQ tends to be considered not as good as Asus, Samsung, or even Dell. Part of it is probably the models offered here. Lots of lower end stuff. Another part of it is the ancestry from Acer, which has had a typically mediocre reputation in the US.

    • puppetworx
    • 5 years ago

    Very interesting review. After seeing lots of ghosting complaints from other reviewers, I looked around on some forums and found lots of people saying the ghosting was hardly noticeable and disappeared almost entirely if Advanced Motion Acceleration (overdrive) is turned up. This gels with Scott’s account, as do the poor calibration and ‘weird’ settings from the BenQ factory.

    I’ll wait for the Asus MG279Q launch, but after this review I’ve added the BenQ XL2730Z back onto my shortlist.

    • weaktoss
    • 5 years ago

    Could Nvidia’s reluctance to support Adaptive Sync have to do with not wanting to piss off its display hardware partners? I’d imagine manufacturers who have brought G-Sync monitors to market might be a bit peeved if Nvidia decided to support a standard that effectively obsoletes those monitors.

    • cygnus1
    • 5 years ago

    [quote<] over two thirds of our readers are rocking a main desktop monitor resolution of 1920x1200 or lower. [/quote<]

    I have wanted to upgrade my monitor for about 2 years now. I am rocking a 1920x1200 24" monitor that's nearing a decade in age. I paid a decent amount of money for it, over $600 I believe. I paid that much because it's a 16x10 monitor with a USB hub and lots of video inputs: it has HDMI, VGA, DVI, and even component input. It was fairly future-proof for the time.

    I'm willing to spend another good-sized chunk of money on a new display, but it needs to be fairly future-proof. I'd probably go as high as $1k for a display that I thought would come close to lasting me over 5 years. However, we've had so many advances and competing monitor techs come out in the last few years that it's hard to pull the trigger on anything. I haven't seen a single display come out yet that fits the bill, nor has any vendor talked about bringing together all the right techs in one device.

    Not helping matters are Nvidia and AMD. They need to figure out some adaptive refresh rate standard that they'll both support. Even though I do have a preference for G-Sync, proprietary is bad, and right now that means Nvidia is the one causing the problem. The other way for that problem to be solved is for a display manufacturer to come out with a display that will work with both techs.

    Here's what I think should be a no-brainer, sell-lots-of, monitor spec list:
    -Good color, good viewing angles, non-TN panel (IPS or AHVA or whatever)
    -30Hz to over 100Hz adaptive refresh rate, working with AMD and Nvidia cards
    -1440p or 1600p depending on the aspect ratio
    -16x10 27/28" or 21x9 31-34"
    -HDMI 2.0a (would've said 2.0, but that line got moved again)
    -DP 1.3 (might have to be 1.4 if that gets finished soon and it actually adds anything useful)
    -USB 3.1 gen2 hub with multiple Type-C ports with support for full Power Delivery
    -Maybe even have its power port be a USB Type-C port as well
    -Basic 2.0 speakers that can decode the output from the HDMI or DP audio

    I feel like I'm forgetting something, but Dell, Asus, Acer, BenQ, whoever, feel free to throw in other features as well.

    Edit: and when I say future-proof, I do mean GPU-agnostic as well. All its features should work with any GPU that supports any version of that feature.

      • Meadows
      • 5 years ago

      Why insist on non-TN?
      TN continues to have the advantage in fast and/or sudden screen updates.

        • sweatshopking
        • 5 years ago

        cause it has crap viewing angles and looks like trash. it is faster, but it’s not nearly as nice to look at.

          • Luminair
          • 5 years ago

          you speak very authoritatively. it’s your word versus scott’s now. let’s just put it out there: you’ve never seen this monitor before, so what you’re saying about it is worthless. right?

            • sweatshopking
            • 5 years ago

            I’M NOT SURE IF YOU SEE WHO YOU’RE TALKING TO. I’LL GIVE YOU A FEW MINUTES TO THINK ABOUT IT AND APOLOGIZE.

          • auxy
          • 5 years ago

          Anti-TN noobs… ( *´艸`)

            • sweatshopking
            • 5 years ago

            Some people just like quality. What can i say? i’m a king.

            • Terra_Nocuus
            • 5 years ago

            it’s good to be the king

            • anotherengineer
            • 5 years ago

            [url<]https://s-media-cache-ak0.pinimg.com/236x/f4/6e/2f/f46e2fd952d793e6e785fe7f113bfa3d.jpg[/url<] or this? [url<]http://i.imgur.com/vA2GyED.png[/url<]

            • jihadjoe
            • 5 years ago

            Now that 144Hz IPS panels are finally upon us there’s absolutely no reason not to be an anti-TN noob.

          • Meadows
          • 5 years ago

          Have you read the review? Like, at all?

          • UnfriendlyFire
          • 5 years ago

          It’s possible to have a good-quality TN display that can match IPS. The problem is that it’s easier to make a cheap TN than a cheap IPS.

        • cygnus1
        • 5 years ago

        What SSK said. TN can DIAF. Especially for bigger screens or mobile devices where there is a larger chance that all or some part of the screen will be at an off angle.

        Recent advances in the non-TN displays have allowed for higher refresh rates. I don’t need it to be 144Hz necessarily, but over 100Hz is what I personally want. I don’t really know where 144Hz came from as a target, seems arbitrary to me. 120Hz used to be the target, but I guess that wasn’t fast enough for some folks.

          • l33t-g4m3r
          • 5 years ago

          Remember CRTs? 72Hz was kinda the minimum to avoid flicker. This makes sense for 3D.

            • cygnus1
            • 5 years ago

            Ahhh, so this is a holdover from CRTs… Screw 144 then. 120 will be fine for me. The difference between 120 and 144 isn’t big enough to make me pay any kind of premium for one over the other.

            Unless, you know, they want to go nuts and go to 240 or something stupid crazy like that. And not the 240 like a TV does 240, real 240. I’d love to see what tricks it would take to get 4K or even 1080p running 240Hz over a standard cable.

            • l33t-g4m3r
            • 5 years ago

            It’s not fine if you’re using 3d shutter glasses. That’s why they upped the refresh.

            If you’re not using 3d, then of course 120 isn’t a problem, but the point of 144hz panels was originally for 3d. 120hz with shutter glasses would essentially be the same as a CRT with a 60hz refresh rate, which is the real reason why we have 144hz.

            • cygnus1
            • 5 years ago

            Ok, that makes sense too; I didn’t consider that 3D shutters would essentially cut the refresh rate in half. But I still don’t see why they just didn’t go for 150 then. It’s a more even number and not a large percentage more than the 144. Eh, whatever though, I’m saving 3D for Oculus-type VR headsets. I think those will really take off with that 60GHz wireless tech, so all you need is a headset with a display, a battery, and a Wi-Fi chip. God forbid the battery in it overheats though….

          • Meadows
          • 5 years ago

          144 Hz comes from needing the ability to play back 24 fps movies without frame pacing artifacts. 144 is the next step after 120 for that.

            • cygnus1
            • 5 years ago

            Ahh, that makes perfect sense. One of those facts that after it’s been explained seems really obvious.

            I do wonder, though, why that would be a concern for computer displays. Yes, 120Hz is 5 x 24, and 144 is 6 x 24, but no other common refresh rates (I’m thinking 60Hz, 75Hz, etc.) are even multiples of 24. I feel like I have seen 96Hz as an option on some devices at some point, but can’t recall exactly where, so I may be remembering wrong. On TVs they’ve gone with 120Hz and 240Hz (unless you count the goofball stuff done on plasmas as real refresh rates). Not that any TVs can actually accept either of those refresh rates as input, but at least that’s what their “motion engines” supposedly upscale to and output.

            Still, while a multiple of 24 does make sense as an explanation, I still don’t see why they would bother. Maybe just as a one-up on other manufacturers? To me, 5 or 6 x 24 doesn’t seem like it would make a difference either way, and 144 is not an even multiple of 29.97 or 30, so 120 seems better than 144 as far as movies and TV go.
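            For what it’s worth, a quick sanity check of which common refresh rates divide evenly by 24 and by 30 (plain arithmetic, nothing monitor-specific):

            [code<]
            # Which common refresh rates are whole multiples of 24fps film and 30fps video?
            for hz in (60, 72, 75, 96, 120, 144, 240):
                fits_24 = "yes" if hz % 24 == 0 else "no"
                fits_30 = "yes" if hz % 30 == 0 else "no"
                print(f"{hz:3d}Hz  24fps: {fits_24:3s}  30fps: {fits_30}")
            # Only 120 and 240 divide evenly by both. 72, 96, and 144 suit film but not
            # 30fps video; 60 handles 30fps but not film, and 75 fits neither.
            [/code<]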

            • Meadows
            • 5 years ago

            Actually, 60 Hz is the one that’s a “holdover” from the CRT era. Still, it’s not completely terrible, because — as you may have noticed — it’s exactly half of 120, so unless you use vsync with it, movie frames will sync up with it every 4 frames (once per 167 ms). Whether that’s noticeable depends on what you’re watching.

            I don’t remember 96 Hz myself.

            Anyway, 120 Hz is a pretty sweet spot, and quite future-proof no matter which way you look at it. I wouldn’t personally need 144.

            I’m not sure of the reason why progress seems capped at 144 Hz rather than, say, 168 or 192 Hz or something else. At first I thought it might have something to do with bandwidth (seeing how 144 Hz is already too much for DVI, even at regular 1080p) but that doesn’t seem likely because DisplayPort is the current standard and it has several times the bandwidth of DVI.
            It might also have something to do with the switching speed of LCD pixels. And then there’s the question of content: strictly speaking, there’s no content that fully utilises either 144 *or* 120 Hz. In other words, “mere” 120 Hz ought to be enough for everyone for the next several years.

            And so we finally arrive at marketing. Essentially, 144 Hz is “120 Hz but one-upped”. So there you have it.

            • l33t-g4m3r
            • 5 years ago

            Except that you can see blinking on a 60hz CRT, and 120hz with shutter glasses would essentially be the same. 144hz is necessary for shutter glasses, so you don’t see the blinking.

            • Meadows
            • 5 years ago

            I doubt 3D is really part of the equation, because NVidia’s solution is the only game in town right now and if this panel had the green blessing then every outlet reporting on it would’ve mentioned that bullet point already.

            At least in the case of this particular display.

            • l33t-g4m3r
            • 5 years ago

            No, it is the ONLY part of the equation. Being a FreeSync monitor, however, means NV has nothing to do with its 3D. TriDef would be the AMD solution, and it’s 3rd-party software/hardware. AMD doesn’t sponsor 3D or monitors like NV does, as it’s all outsourced.

            Think about it for longer than two seconds, and reality might set in.

            120/60hz shutter glasses = blinkfest
            144/72hz shutter glasses = acceptable

            That’s the reality of 144hz. NV may have started it, BUT THIS ISN’T A NV MONITOR. That doesn’t change the origins though. 144hz originated from 3d, and that’s the new standard whether or not you personally use 3d.

            Saying 144hz isn’t part of the equation is like NV saying dx11 isn’t part of the equation, because their dx10 cards do physx. People who know better are like, “what are you smoking?”.

            Monitor refresh rates are not random or arbitrary. Same with video frame rates. There are standards for a reason. You don’t have to understand why, but it is a standard, and that’s pretty obvious since everyone uses 144hz.

      • yokem55
      • 5 years ago

      I’m in exactly your boat. I’m rocking a 7-year-old 23″ 1680×1050 Samsung, and the power board has been failing for a few months now. It takes about five minutes for it to warm up to the point where it doesn’t give me a seizure. I’ve been hoping the sweet spot of G-Sync, 4K, high refresh rates (doesn’t need to be 144, but much more than 60), and high-quality colors would come around soon so I could replace it. It looks like the tech I’m looking for is still a year-plus off.

      I’ve tracked down and ordered a new power board on Ebay instead, and if that doesn’t work I’ll pick up whatever $100 will buy me on the next deal of the week. Sigh…

        • Huhuh
        • 5 years ago

        I have an old Samsung 226BW which started having problems turning on. I changed a few bad capacitors, and it’s been working wonderfully ever since. Pretty easy fix for under $5. Hopefully it will last a few more years before I have to upgrade.

          • yokem55
          • 5 years ago

          Same model for me. I figured the extra cost of the full board is well worth not having to do the soldering myself though…

      • the
      • 5 years ago

      I mostly agree with your list, though I don’t believe having the monitor both draw and supply power over USB simultaneously is a good idea. Something like a monitor is expected to be stationary, so having an integrated AC adapter and a standard power cable isn’t a bad thing, especially since a monitor is logically something that could provide the power to charge a laptop.

      You’d also want a couple of USB Type-A ports for the standard keyboards, mice, and thumb drives that are commonplace today.

      4K and 5K resolutions would be nice, even if gaming has to be done at a lower resolution. You’d just need a good, fast scaler so that integer-based resolution scaling isn’t bad (i.e., gaming at 1920x1080 on a 3840x2160 panel).

        • cygnus1
        • 5 years ago

        That’s the beauty of USB Power Delivery: it’s negotiable in both directions, and power can come or go from any port. 50W or so isn’t too much to ask a video card or motherboard to passively provide if it’s already hooked directly to the PSU.

        It would also be nice, down the road, to have the monitor be the power supply for a laptop that’s outputting back to it over the same cable.

        Think future proof.

          • the
          • 5 years ago

          I think my disdain stems from the concept of having a monitor solely reliant on receiving power over a Type-C connector. I don’t see this being practical for most. However, the more I think about this idea, the more awesome it sounds as an option. Need to demo something from a beefy laptop? Being able to run an external monitor for your half-hour presentation could be useful.

          I do see monitors as an ideal place to integrate a laptop charger which would necessitate regular AC power.

            • cygnus1
            • 5 years ago

            That’s exactly it: flexibility. Let it receive input and send output over the Type-C ports. You could even end up with your next video card having a built-in USB controller so it can combine its DisplayPort video output signal, USB data, AND power out over one small reversible cable to your monitor! That’d be pretty sweet in my book.

      • Cuhulin
      • 5 years ago

      The one thought I would have about your specs is the USB hub. I used to use what I suspect is the same monitor as you, based on the specs you list, but recently, I’ve simply gone to a separate USB hub, and I heartily recommend doing that. As the tech changes, the hub can be changed without changing the rest of the monitor.

        • cygnus1
        • 5 years ago

        Gateway fhd2400?

          • Cuhulin
          • 5 years ago

          Yup! I liked the stand portion enough that when I first went to multiple monitors, I purchased two more of those stands for the other monitors and also bought the speaker bar.

          It’s a pity that Gateway got bought out. They were a fine company.

    • odizzido
    • 5 years ago

    I am glad this is working so well. It definitely makes AMD products more desirable.

    • Westbrook348
    • 5 years ago

    Slightly OT, but my 30″ 1600p IPS Korean monitor died and I need advice. I think the panel itself still works perfectly, but something went out in the board. For months, the screen would display vertical lines of various bright colors when first turned on, then fade to white until I turned the thing off. Cycling the monitor on and off 2-5 times would usually get the screen to finally start normally, which is why I think the (beautiful IPS) panel is still fine. The circuitry just acts dead.

    The chances of getting an intact new board from Korea are pretty slim, right? I just don’t think it’s time to throw it in the trash.. I paid $500 for it and still think it can be repaired; if not, it should still be worth something, shouldn’t it, since 99% of the parts are probably in working order? Maybe I should’ve taken it in to get checked out while it would still occasionally work, but I don’t actually know where to take it or send it. I live in northern Indiana. What would you guys do in my situation? Thanks

      • odizzido
      • 5 years ago

      I would open it up and check out the capacitors. I’d say there is a decent chance that one of them is going and they’re fairly easy and cheap to replace.

        • Westbrook348
        • 5 years ago

        Thanks for the response! I wouldn’t know what to look for if I opened it. Are failing capacitors obvious?

          • sweatshopking
          • 5 years ago

          swollen, leaking, etc.

          • Sam125
          • 5 years ago

          If I were you, I wouldn’t even muck around with trying to fix it but instead contact the seller and see if he/she’ll give you or at least sell you another electronics board. Ebay sellers tend to care about their reputations so maybe they’ll be obliging. Just a thought.

      • sweatshopking
      • 5 years ago

      I’d definitely get it checked out. How old is it?

        • Westbrook348
        • 5 years ago

        Bought it brand new on ebay and it worked perfectly for about 4 months and then had the vertical lines on start up for another 6 months or so. I have no idea where I would take the monitor to get it repaired.

          • sweatshopking
          • 5 years ago

          they usually have a 1 year warranty. I’d definitely contact the guys you bought it from. nobody local repairs TV’s? i live in teh middle of nowhere in Nova Scotia and there is a few guize around. maybe put an ad on kijiji or craigslist?

      • albundy
      • 5 years ago

      Cheap Chinese capacitors probably leaked; cutting corners is the norm these days. You gotta crack it open and check for busted, leaking, or bulging-at-the-top capacitors. Start with the power board and then the inverter board (some monitors have both on the same board). If it’s not that, then get a multimeter and check the diodes for power leaks; they should show readings going only one way. Also, if you are using an external power adapter, you might want to consider trying another one.

      Edit: also check the screen’s back-panel connector. A lot of times the tape that holds them gets loose for whatever reason. (This caused abnormal vertical lines on my old Sceptre LCD.)

    • derFunkenstein
    • 5 years ago

    [quote<]Nvidia has indicated that it intends to continue the development of G-Sync, perhaps adding new capabilities beyond variable refresh. Some of the possibilities—like eliminating the left-to-right, top-to-bottom LCD panel refresh pattern or interpolating between frames during long intervals a la Carmack's time warp—could improve gaming displays even further.[/quote<] Mulling this bit over in my head, I came to a realization: nVidia has stumbled upon the PC gaming equivalent of the audiophile. Just as audiophiles spend hundreds or thousands of dollars on speaker wire, USB cables, and diamond-tipped turntable styluses, PC Gamingophiles will spend hundreds of extra dollars on special monitors that randomize drawing patterns so they'll think it's more "lifelike."

      • sweatshopking
      • 5 years ago

      yes.

      • psuedonymous
      • 5 years ago

      Not a great comparison. You can do objective testing, and double-blind testing, and demonstrate there is no difference between a super-OFC-silver-whatever £1000 cable and a coathanger. But you can also hook an oscilloscope to a photodiode on a G-Sync and a DP Adaptive Sync display, or point a high-speed camera at them, and easily measure the difference.

        • derFunkenstein
        • 5 years ago

        I’m not talking about variable refresh (which I really do want… someday); I’m talking about all the other stuff nVidia is apparently doing, based on the quoted text.

          • superjawes
          • 5 years ago

          I still disagree, depending on the implementation. It could mean that Nvidia applies the same variable refresh to each individual pixel via some differential logic. That would provide a measurably positive effect.

          It’s obviously still in the fancy, words-only concept phase, but I won’t discourage Nvidia from exploring it.

          And on the audiophile cable thing, that schtuff is 100% certified voodoo.

            • derFunkenstein
            • 5 years ago

            It’d for sure be the death of TN panels, because if you only refresh the pixels you need, the others will eventually turn white. And by “eventually” I of course mean in about 45 ms or so.

          • psuedonymous
          • 5 years ago

          As am I.
          You can hook a photodiode up to a ‘scope, tape it to a display, and watch the refresh occur. A G-Sync panel will start to refresh-double (and triple, and quadruple) when driven at update rates below the panel’s minimum refresh rate (i.e., past its maximum frame time). The DP Adaptive Sync panel will continue to refresh at the minimum rate; frames will either arrive with tearing or with judder (whole-frame-only updating), depending on what behaviour FreeSync is set to. The benefit of the frame-doubling is that frames below the minimum threshold STILL arrive as they are rendered; instead of a frame arriving and being displayed for X ms, it arrives, is displayed for 1/2 X ms, and then displayed again for 1/2 X ms.

          This isn’t speculation; this is observable behaviour. You can independently verify it with objective tools (i.e. electronic equipment, not the often-fallible human visual system).

          The frame-doubling behaviour can be replicated with FreeSync by having the GPU send frames multiple times (rather than G-Sync’s approach, which buffers frames and has the controller do the work monitor-side).
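          A minimal sketch of that GPU-side approach, assuming the XL2730Z’s 40-144Hz window; pace_frames() and send_to_panel() are made-up names for illustration, not anything from AMD’s driver:

          [code<]
          # If no new frame shows up before the panel's maximum hold time expires,
          # send the previous frame again so the panel never drops out of its VRR range.
          MIN_INTERVAL_MS = 1000.0 / 144   # ~6.9 ms: fastest the panel can refresh
          MAX_INTERVAL_MS = 1000.0 / 40    # 25 ms: longest the panel can hold a frame

          def pace_frames(frame_ready_times_ms, send_to_panel):
              last_sent_at = 0.0
              last_frame = 0
              for i, ready in enumerate(frame_ready_times_ms, start=1):
                  # Repeat the old frame as many times as needed while we wait.
                  while ready - last_sent_at > MAX_INTERVAL_MS:
                      last_sent_at += MAX_INTERVAL_MS
                      send_to_panel(last_frame, at_ms=last_sent_at)
                  # The new frame goes out once it's ready and the panel can take it.
                  last_sent_at = max(ready, last_sent_at + MIN_INTERVAL_MS)
                  send_to_panel(i, at_ms=last_sent_at)
                  last_frame = i

          # 20 fps content on a 40-144Hz panel: every frame ends up shown twice.
          pace_frames([50.0, 100.0, 150.0],
                      lambda f, at_ms: print(f"frame {f} sent at {at_ms:.1f} ms"))
          # (This ignores the collision problem discussed elsewhere in these comments:
          # a repeat that starts just before a new frame arrives can delay it by ~7 ms.)
          [/code<]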

            • derFunkenstein
            • 5 years ago

            and in 24/96 audio, you can look at the waveform and compare it to 16/44.1 and see a difference. But is it something that you can feel in the experience? Or just a placebo?

            • superjawes
            • 5 years ago

            Um, I don’t think the waveforms would be different. Once you convert to analog (time domain), I think they will be virtually identical, and any difference would be irrelevant.

            First: frequency and sampling rates. In order to reconstruct a signal without aliasing, your sampling frequency needs to be at least double the greatest frequency in the signal. Human hearing drops off at 20 kHz, so 44.1 kHz is more than enough (higher will only be useful for inaudible signals).

            Second: bit depth. This ends up storing amplitude information. There might be more details I am skipping, but the short story is noise. Greater bit depth gives you a lower noise floor, so increasing this won’t necessarily be visible if you look at the signal on an O-scope.
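            The back-of-the-envelope numbers behind both points (standard signal-processing rules of thumb, nothing specific to any product):

            [code<]
            import math

            # 1) Sampling rate: Nyquist says you need more than 2x the highest frequency.
            hearing_limit_hz = 20_000
            for rate_hz in (44_100, 96_000):
                print(f"{rate_hz}Hz sampling captures up to {rate_hz / 2:.0f}Hz "
                      f"(hearing tops out around {hearing_limit_hz}Hz)")

            # 2) Bit depth: each bit buys ~6dB of dynamic range above the quantization
            #    noise floor (20*log10(2^bits)).
            for bits in (16, 24):
                print(f"{bits}-bit: ~{20 * math.log10(2 ** bits):.0f}dB of dynamic range")
            # 16-bit gives ~96dB and 24-bit ~144dB; the extra headroom sits below anything
            # audible at sane listening levels.
            [/code<]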

            • derFunkenstein
            • 5 years ago

            Actually I was talking about looking at the waveform in a DAW like Pro Tools or the like. And thanks to dithering (noise), if you zoom way in there is a visible difference. But can you hear it? I can’t. And I record everything at 24/96 solely because I Can Do It.

            • superjawes
            • 5 years ago

            Well that’s kind of my point…any difference you “see” is confirmed to be outside the range of human hearing, so you would not be able to hear it.

            • derFunkenstein
            • 5 years ago

            Right, and what I was trying to say to pseudonymous is that if you have to put a monitor under an oscilloscope to see the difference, can you really see it?

            • superjawes
            • 5 years ago

            Gotcha… but still, that isn’t exactly a good comparison. The range of human hearing is pretty well known, while visual perception is less so, at least when it comes to movement. In fact, I’m not entirely sure how you would quantify conversion between continuous and discrete domains when it comes to visuals, and turning that into a human perception metric is a whole ’nother issue.

      • Lans
      • 5 years ago

      I don’t really know how monitors work, but it seems to me that the left-to-right, top-to-bottom thing could be outside the G-Sync module and could work with FreeSync as well.

      I am mixed about interpolating (or rather, my assumption of how it would work). I think ideally it would smooth transitions between frames, but it seems like it would introduce a delay of one frame?

      • Krogoth
      • 5 years ago

      The videophile market has been around for a long time.

      G-Sync/FreeSync is just the latest “fad” in the videophile market, replacing the whole TN-versus-IPS panel flamewar.

    • _Ian_
    • 5 years ago

    I’m not sure I understand why 30 fps isn’t as smooth using FreeSync as G Sync.

    Given that the max time the monitor can display a frame for without refreshing is 25ms (40fps), let’s examine what happens for 30 fps content.

    Frame 0 is displayed at time 0ms. Frame 1 will be ready at 33ms (30 fps). The monitor has to refresh at 25ms to prevent pixels losing state. Then, because the monitor can run at 144hz, it can display a new frame again 6.9ms after that, at 31.9ms. Then frame 1 is ready at 33ms, and the monitor should be able to display it with no delay, no stutter (obviously this only applies to 30 fps, and 31 – 39 fps do cause stuttering, but it was 30 fps used in the example).

    The duplicate frame should never collide with the new frame at 30fps. What am I missing?

      • _ppi
      • 5 years ago

      The issue was mostly spotted on the LG screens first reviewed by PCPer, which have a 48-75Hz range that is rather unfortunately narrow.

      I doubt anyone would notice sub-40 fps stutters with a 144Hz screen, but looking at PCPer’s analysis, AMD probably could tweak its drivers a bit to pre-refresh preemptively.

      The oscilloscope test PCPer did was with vsync off, which might have thrown the results off a bit. I wonder what it would be with vsync on. It was also performed at constant fps, so quick fps swings could not be analysed. Still, it was the best attempt to dissect what happens that I have seen on various tech sites.

        • _Ian_
        • 5 years ago

        I’m specifically talking about the example in this article though, the swinging pendulum demo on page 3, which is locked to 30 fps.

        You can see in the video, and Scott points out, that the G-Sync display is slightly smoother. I don’t understand why, though, for the reasons outlined above.

          • Firestarter
          • 5 years ago

          The maximum rate of the monitor is one frame every ~7ms, the minimum one every 25ms. So, after waiting for a frame for 25ms, the GPU must send the old frame again, which presumably takes 7ms. 25 plus 7 equals 32ms, which is really close to the 33ms interval between frames when the frame rate is 30 fps. I guess that 30 fps is just about the lower bound at which these collisions happen.

          I’d love to see AMD copying Nvidia in this regard, if at all possible, but it seems that with this kind of monitor it’s not as big of a deal as I thought it might be. By the way, that pendulum demo looks worse for me at 30 fps than it does at 60 fps (which makes sense).
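          A minimal sketch of the timeline described above, assuming the forced repeat starts at exactly 25 ms and occupies the panel for one full ~6.9 ms scan (both assumptions; the review doesn’t give exact figures):

          [code<]
          # When does a new frame actually reach the screen if the panel just had to
          # repeat the old one at the 25 ms mark?
          SCAN_MS = 1000.0 / 144            # ~6.9 ms per refresh at 144Hz
          FORCED_REPEAT_AT_MS = 1000.0 / 40 # 25 ms: panel must repeat the old frame

          for fps in (30, 32, 35, 38):
              new_frame_ready = 1000.0 / fps
              panel_free_again = FORCED_REPEAT_AT_MS + SCAN_MS   # ~31.9 ms
              shown_at = max(new_frame_ready, panel_free_again)
              print(f"{fps} fps: frame ready at {new_frame_ready:.1f} ms, "
                    f"shown at {shown_at:.1f} ms (+{shown_at - new_frame_ready:.1f} ms)")
          # 30 fps squeaks by with no added delay; frames arriving at 32-39 fps land
          # while the panel is still busy with the repeat and get pushed back a bit.
          [/code<]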

            • _Ian_
            • 5 years ago

            I see what you’re saying, it is close, but that ~1.4ms is a lot of time in relative terms when a full frame only takes ~6.9ms. There [b<]is[/b<] enough time to use the new frame at 30 fps.

            If there were some additional inter-frame delay that pushed it past 33.3ms, then how could the monitor ever do a full 144 fps?

            The most likely answer I have is the demo was locked closer to say 32 fps or something, but really, I have no good answer, which is why I asked.

    • DPete27
    • 5 years ago

    [quote<]If you tune your image quality settings right, the vast majority of PC gaming should happen between 40 and 144Hz, not below the 40Hz threshold[/quote<] Key sentence for the TL;DR crowd.

      • willmore
      • 5 years ago

      But, but, but, but, the sliders go to 11. That’s *more*.

    • dragontamer5788
    • 5 years ago

    Thanks for the excellent review!

    I’ve been more interested in the LG 29UM67 panel (IPS, ultra-wide, freesync). I know that you guys don’t normally do tons and tons of monitor reviews, but since the LG panels offer something different… maybe yall can do a review?

    In either case, it’s great to see an excellent, in-depth review of these new “FreeSync” monitors, even if it isn’t one I was particularly interested in.

      • sweatshopking
      • 5 years ago

      i should also say that I really appreciated the review. Don’t think, scott, baby, that i’m disagreeing or disappointed in that regard.

      • Kretschmer
      • 5 years ago

      From what I’ve read the LG panels have a problematic refresh rate spread (from 48hz to 74hz). If your FPS is between 37 (74hz/2) and 47hz, you lose the effects of adaptive sync. So they’re nice panels but not optimal for gaming.
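        Assuming the only trick available is showing each frame a whole number of times (the “frame repeat” idea discussed in the reply below), the gap falls out of simple arithmetic. The range used here is the 48-74Hz figure quoted above:

        [code<]
        # For a panel with a 48-74Hz variable-refresh window, which frame rates can be
        # multiplied by a whole number of repeats to land inside the window?
        VRR_MIN_HZ, VRR_MAX_HZ = 48, 74

        def effective_rate(fps):
            for repeats in (1, 2, 3):
                if VRR_MIN_HZ <= fps * repeats <= VRR_MAX_HZ:
                    return fps * repeats
            return None

        for fps in (30, 35, 37, 40, 45, 50):
            rate = effective_rate(fps)
            label = f"{rate}Hz" if rate is not None else "no fit"
            print(f"{fps} fps -> {label}")
        # 30, 35, and 37 fps can be doubled into the window (60/70/74Hz), but 38-47 fps
        # can't: 1x sits below 48Hz and 2x overshoots 74Hz, so that band falls back to
        # fixed-refresh behaviour on a panel like this.
        [/code<]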

        • dragontamer5788
        • 5 years ago

        [quote<]From what I've read the LG panels have a problematic refresh rate spread (from 48hz to 74hz). If your FPS is between 37 (74hz/2) and 47hz, you lose the effects of adaptive sync.[/quote<]

        That sounds more like how G-Sync handles things. It sounds like the effects of sub-48Hz are going to be handled by the 13ms "frame repeat" (1/75Hz) that was talked about in this review. However, it would be best if there were some independent confirmation of this.

        [quote<]So they're nice panels but not optimal for gaming.[/quote<]

        Indeed. Clearly the 144Hz panels are best for gaming. But I also do some video editing, so the 99%+ sRGB coverage, deeper contrast, and such (the LG FreeSync monitors are IPS) give me some utility outside of video games. If the LG panels are "good enough" for video gaming (low display lag, good enough response times), I'm going to purchase one. But I'm waiting for a solid, comprehensive review from somebody.

        (G | Free)Sync AND IPS is ONLY being offered by LG. There are a ton of high-speed TN panels on the marketplace. Furthermore, the [url=http://www.adorama.com/LOT29UM67.html?gclid=CJDfhpW19MQCFQIQ7AodvmwAFw<]29-inch LG panel is only $450[/url<], making it the cheapest FreeSync monitor on the market right now.

          • _Ian_
          • 5 years ago

          It’s not just LG: Acer and Asus have IPS G-Sync and FreeSync monitors coming out, respectively, and they’re both 144Hz to boot. They’re a more standard 27″ 16:9, but they won’t be cheap.

          [url<]http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm[/url<] [url<]https://techreport.com/news/28081/asus-mg279q-display-hops-on-the-freesync-wagon[/url<]

    • ptsant
    • 5 years ago

    Nice review. If you say FreeSync is good enough, I feel confident buying it. Now let’s see some more monitors to choose from. I could treat myself to a 390 and a monitor if I get a raise this summer…

      • sweatshopking
      • 5 years ago

      You’re actually going to be laid off this summer. I’d save your money. you’re going to need it.

        • cygnus1
        • 5 years ago

        That’s just not a funny joke.

    • Meadows
    • 5 years ago

    That “more than two thirds of your readers” part is actually around three quarters, if I remember correctly. Or almost. Not that it makes much of a difference.

    As for your call to arms about upgrading monitors: I understand the awe you must be in with regard to these new displays, but you, as a journalist swimming in various gear, might wish to take a step back every now and then and realise that those three quarters of your readers might not have what it takes to fully *utilise* any of the new monitors.

    Essentially, those readers all suffer from the same thing that ails sweatshopking, including yours truly. By denouncing 1080p, you make it sound as if three quarters of your readers were beneath your standards; either that, or you’re making assumptions regarding their apathy on the matter, when for them it’s really just a catch-22 between GPU power and money to burn.

      • anotherengineer
      • 5 years ago

      Even the “gamer” Steam survey [url<]http://store.steampowered.com/hwsurvey/[/url<] shows:

      1366x768 - 26.64%
      1920x1080 - 34.08%
      2560x1440 - 1.09%

      and almost 20% of graphics by Intel, lol. Maybe it's time for Intel to use adaptive sync so games can look buttery smooth on 1366x768, lol.

      Edit - agree/side with Meadows and get down thumbs. Thx Meadows.............

      Edit 2 - Well, the main thing in Damage's conclusion ("you don't exactly notice variable refresh when it's working well, because it's silky smooth and seamless") is that AMD's FreeSync works as expected and works well. At least that is the main point to me.

        • Meadows
        • 5 years ago

        People don’t seem to like the things I say, but nobody’s actually come out to disagree with me.

          • sweatshopking
          • 5 years ago

          i’m actually blown away i’m not -20 yet. I have been every time i’ve voiced this opinion in the past. end user and deanjo must not have read the review yet.

            • End User
            • 5 years ago

            I’m here now. Job done.

          • anotherengineer
          • 5 years ago

          It’s not the things you say, it’s just you 😉

          j/k

          Edit – I mean, who wants to agree with a field of grass!! 😉

          Edit 2 – I don’t usually up or down vote anyone, I kind of find that childish, I will usually add a statement instead.

            • sweatshopking
            • 5 years ago

            I’m not sure it is a j/k. some of us long time boyzzzzz and gurlzzzzz have relationships and appreciation for each other due to length of time and just general interaction. I know Glorious used to strongly dislike me, but now we’re basically soul mates. He grew to appreciate me, even if i’m a PITA, and vice versa. I appreciate and enjoy meadows, but we’ve been talking for 7 years or so.
            some of these new guize don’t have the history, and just downvote because they disagree.

            • anotherengineer
            • 5 years ago

            I think Deanjo is going to down this 😉

            • sweatshopking
            • 5 years ago

            deanjo, maybe. EU DEFINITELY.

            • End User
            • 5 years ago

            -3 never felt so good

            • auxy
            • 5 years ago

            I hate most of you. (*‘∀‘)

            • sweatshopking
            • 5 years ago

            yes?

            • JustAnEngineer
            • 5 years ago

            Give in to your anger…
            [url<]https://www.youtube.com/watch?v=PFkAAvDkj9k[/url<]

            • End User
            • 5 years ago

            +3 for being honest

            • Meadows
            • 5 years ago

            Voting is fine. I like it. Do vote, people.

            Here’s my internet interpretation of voting matters, simple as pie:
            – If I get a hundred upvotes, that means I’m right because people agree with me.
            – If I get a hundred downvotes, that still means I’m right because people just can’t handle the truth and I get to be an outspoken savant.

            • anotherengineer
            • 5 years ago

            lolirl

            edit – I wish TR would put all the people who up’d and down’d the comment in the bottom of the comment box like they do here
            [url<]http://www.snbforums.com/threads/asuswrt-merlin-378-52-is-available.23738/[/url<] and here [url<]http://www.techpowerup.com/forums/threads/how-to-reinstall-video-drivers-quick-guide.52502/[/url<]. I think that would be interesting. It would make it easier for SSK to work those down-votes, knowing people's personal opinions on things 😉

            • sweatshopking
            • 5 years ago

            i’d LOVE to know that information.

          • End User
          • 5 years ago

          I disagree with you. Stop being a total wanker. This is an enthusiast site. Awesome shit is going to get reviewed. Most of which will be better than what you or I have. That is the fucking point!

        • derFunkenstein
        • 5 years ago

          Those people playing at 1366×768 on their Intel HD 2500 systems aren’t reading TR, either.

          • sweatshopking
          • 5 years ago

          I think many of them are. Those systems DOMINATE the developing world. many people in eastern europe, asia, etc. who read here are likely on those systems.

            • derFunkenstein
            • 5 years ago

            I don’t think the developing world is reading TR. A few examples, sure, but not en masse, despite the weird forum spam.

            • sweatshopking
            • 5 years ago

            Croatia? Romania? I bet more are than you’d think. Only scott would know, but I’d expect it’s a decent sized chunk of the visitors. not 30%, but definitely more than 1-2, i’d think.

            • derFunkenstein
            • 5 years ago

            Oh, sure, you might be talking about 5-10% of TR’s audience (which in the registered commenters seems to be almost entirely US/Canada). But even then, I think at least half of THOSE 5-10% are playing on something other than Intel IGP.

            There are some people here and elsewhere that have notebooks with HD4000 or HD5000 that are playing games on the road in Steam, and those results count equally with everything else. I think that’s probably why the number is high. Just the mere presence of the hardware means that another gets tallied for Intel. And in my system, the HD4000 on my 3570K is enabled so it can be used in QuickSync. Is that being counted? Is it being counted with systems using Optimus?

          • willmore
          • 5 years ago

          Hmm, I’m reading this on my craptop which has only “Intel HD” graphics in its SB era Pentium processor with the horribly inexpensive 1366×768 TN display. I don’t game on it, but if it had VRR, I might.

            • derFunkenstein
            • 5 years ago

            You don’t need VRR because it already only runs at a rock solid 15fps. :p

            • willmore
            • 5 years ago

            I think you might be mocking me.

      • Damage
      • 5 years ago

      Man, I said in the intro I may get a little too excited about these things. And I said folks may want to consider upgrades. You’re free not to, and perhaps that statement was aimed at someone other than you.

      But I don’t get you guys. Nobody yells at me when I review a GTX 970 or an R9 290X and recommend it. But then folks want to pair these things with a 1080p display? And get hostile when someone suggests otherwise?

      Look, a big, hi-res display is great for gaming, for web surfing, for editing photos and videos from modern capture devices with hi-res sensors, and a whole range of other things. If you’re not a power user, fine, don’t go there. Not everybody can afford it. But for a lot of folks, if you have a 2500K and some savings, I’d advise skipping your next CPU upgrade cycle and getting a new monitor.

      I dunno. I know from the experience of doing web design changes around here that a lot of nerds can’t see very well. We hear generalized criticisms and then find out later the dude complaining is mostly legally blind. Maybe people with compromised vision can’t appreciate better displays. Could be one source of the differences of opinion. Just… I hope it’s not total lack of experience with a nice, big monitor. Using one will convert you, if you’re able to appreciate it.

        • anotherengineer
        • 5 years ago

        Lolz, true enough. And I agree with you to a point: a nice monitor is probably one of the best upgrades you can get, since it gets used 100% of the time (well, that much if one is not blind) and usually lasts 5+ years.

        It’s just that these particular monitors are massive dollars compared to, say, a cheapy 1440p Korean one or quality 1080p ones.

        I am actually waiting for reviews on the Dell and Benq 23.8″ 1440p monitors. Unfortunately I don’t think they will be coming with adaptive sync though. But good ppi and hopefully decent price.

        Sigh, where are the 60Hz adaptive sync screens?!?!?! And where are the affordable 120Hz monitors?!?!? Six-year-old tech and more expensive now than when it came out!!! Sheeesh!!

        • sweatshopking
        • 5 years ago

        I hope you’re not lumping me in with the “hostile” group. You know I have nothing but deep, heartfelt love for you, and I thought the review was great.

        From my perspective, I don’t think the 290x or 980 are fast enough for new games @ 1080p (as you’ve said, you disagree, and really, you’re the expert, not me).

        The GTA V devs say [quote<] To be able to run the game at 4K resolution at 60fps you’ll need a high-end SLI or Crossfire setup. [/quote<] now, this isn't 2160p, but I guess I don't think GPU's are fast enough yet. I WANT variable sync and I appreciate the higher resolution (like I said, it's great for excel), but given the current high costs, and how I [i<] feel [/i<] gpu power just isn't there YET, imma hold off. In 2-3 years, I imagine I'll pick one up with variable sync and higher resolution to replace my likely dead HP IPS 75hz screen.

          • anotherengineer
          • 5 years ago

          I might have a free CRT if you want it when your LCD dies 😉

            • sweatshopking
            • 5 years ago

            it’d cost too much to mail it.

            i actually just got mine replaced. bought an HP monitor 14 months ago, died after 13 months, and only had a 1 year warranty. I bought it at futureshop AND COULD NOT FIND THE RECEIPT ANYWHERE. contacted HP, they said “send us your receipt and we’ll mail the replacement out right away!!”
            I couldn’t find my receipt, but the next day UPS showed up with my replacement. So i mailed back the broken one. new one does have quite a bit of backlight bleed, but c’est la vie.

            • anotherengineer
            • 5 years ago

            You know what’s worse? Those crappy Fshop receipts fade after 8 months. I have a few that are not even 1 year old, and it looks like they were erased, seriously. What are you supposed to do if it’s a 3-year warranty?!?!

            • JustAnEngineer
            • 5 years ago

            Scan it before it fades? Store in a cool dry place? Charge it to a credit card with its own receipt and warranty extension?

            • anotherengineer
            • 5 years ago

            A credit card statement just shows up as a purchase for $$; it does not say what items were purchased.

            Was thinking about scanning, but not sure they would accept a ‘non-official’ receipt.

            Sad thing is I have some receipts in my glove box from 10+ years ago (battery, etc.) that are still good. I think it’s a combination of the crappy paper and ink futureshop use, and a few other stores are starting to use also.

            I do think though, some places are offering to email an e-receipt, about time!!!

            • sweatshopking
            • 5 years ago

            most places will accept a picture of receipts. I take pictures these days and have submitted them. Never had a problem.

            I always try to get emailed receipts now.

        • Meadows
        • 5 years ago

        Sure, I’d love to use one of those screens. However, I refuse to play any of the new or recent games at anything less than “Very High” detail, or equivalent, or better. (I only let go of the highest texture detail settings of the most recent titles, since my GPU has only 2 GiB of memory.)

        And, well, 1080p is just about the best my machine can do using such settings. And that’s completely *without* antialias. Any of it, even post-processing like SMAA.

        This 660 Ti may be two years old now, yes, but as of today there’s no good deal in sight to let me replace it. It’s been overclocked by 30% (!) from NVidia’s stock reference and continues to soldier on just fine as described above. There’s no more juice to be had, it’s all been squeezed out.
        I’m actually very satisfied with it for still being able to run some of the latest titles at basically the best detail levels (sans AA), but to use anything more than 1080p would probably light it on fire, I imagine.

        There will come a time when I’ll upgrade the GPU again, maybe even this year if something convincing comes out between the GTX 980 and 960, but that purchase will *not* coincide with getting a new monitor as that’s just much too expensive in one go.
        Let them halve these monitor prices first and then I’ll reconsider, but right now it’s thoroughly in the “early adopter” zone and hard to recommend to most people as a result.

          • sweatshopking
          • 5 years ago

          OMG MEADOWS!!! WE’RE THE SAMEEEE!!!

          Only I have a 290.

            • Meadows
            • 5 years ago

            I’m a member of the Glorious Green Team Master Race.

            • anotherengineer
            • 5 years ago

            You’re a little green alien??

          • Mentawl
          • 5 years ago

          So you’ve made a choice between quality effects and screen resolution, same as everyone else makes. You refuse to compromise “quality” for resolution. Fair enough.

          Personally I’d choose higher resolution with lower details over low resolution and high details. There are two sides to the coin, just because you’ve chosen one doesn’t mean it’s the right or wrong one. Of course ideally you want both, but the Real World (TM) is a harsh mistress :P.

            • Meadows
            • 4 years ago

            Not an especially high resolution, sure, but calling it outright “low” is going a bit far.

            For example, I sit farther from my monitor than most people, and although I have flawless eyesight I still have trouble noticing pixelation, even without AA, even in the OS. There needs to be a very high contrast edge somewhere or an especially noisy pattern (for example, detailed tree crowns at a moderate distance in an FPS game) before I even so much as notice anything. And even then, it never distracts me.

            The only way 2.5K or 4K would conceivably help me is by giving the same overall effect over a larger screen surface, but that’s kind of wasteful for reasons I’ve described in my other replies. Plus you can only ever focus on one small blotch at a time in your actual field of view, so a lot of that extra processing is never even seen.

            It would probably be a godsend for productivity, but otherwise it’s wasteful for gaming unless you have a GPU with a price tag rivalling that of the monitor. Which some people might have indeed, but that’s far from the norm, therefore strongly suggesting this adoption of new standards was a rather excited move by the author. He admitted as much, though. Of course, the suggestion will eventually end up being right but I doubt this will be the year when that happens.

        • chuckula
        • 5 years ago

        [quote<]Nobody yells at me when I review a GTX 970 or an R9 290X and recommend it. [/quote<] Oh, I beg to differ. I've seen plenty of fanboys from both camps go catatonic when you have the nerve to recommend anything from Nvidia or AMD when it comes to graphics (Intel's never been an issue since I don't think you've ever recommended them!). From here on in: No GPU reviews for anything that costs more than $100!!! 😛

        • Andrew Lauritzen
        • 5 years ago

        As someone who switched to a 30″ monitor when they were still north of $1000, I do agree with you in terms of the importance of a good display. And I’m definitely onboard for both higher refresh rates and adaptive sync (honestly, the former is probably the most exciting).

        That said, there are a few things that still need to be worked out before I’m diving in:
        1) FreeSync vs. G-sync… this one is obvious but there’s no way I’m putting money into a display that will likely last ~10 years without guarantees of support *regardless* of the GPU I happen to have at any given point over that time. Completely unacceptable for my monitor brand to be tied to my GPU.
        2) Proper OS support. Needs to work in windowed mode, with FLIP swap chains, panel self refresh, etc. Full screen exclusive needs to die.
        3) I’d really like to see a min refresh range of ~30Hz. At that speed I’m not too concerned about what happens “below” it, but 40 is slightly too high.
        4) Decent pixel density, but I don’t necessarily need or want 4k+, even @ 30″.
        5) No scaler 🙂 This is harder and harder to find, but scalers provide zero value for me and all they do is add input lag, even if it's minimal.
        6) No multitile nonsense obviously.

        So yeah – I’m excited and ready to dive into this new tech, but it feels like we’re still at least ~6 months from when it gets properly standardized and confidently supported broadly.

        In any case it was a great review and I’m happy to see that the technical underpinnings of this hardware are solid at least!

        • Cuhulin
        • 5 years ago

        I agree wholeheartedly about the large monitors for those of us whose sight is not what it was when we were 19. They are much easier to read. The downside is pixelation if the resolution is not increased correspondingly, so we need high rez.

        Is it costly? Sure. And some can’t go there.

        Is it worth it? Also sure.

        Which leads me to the IPS-TN battle. I’ve used both. In my view, the real issue is not the panel type, it’s whether it has full gamut color and uses it. I’m using a 40″ 4k panel on my home office desk and will replace it again when HDR content is widely available – and for that, the choice of panel tech is not really relevant. I’d like to move to OLED, but I may not live till then 🙂

        • Ninjitsu
        • 5 years ago

        “But then folks want to pair these things with a 1080p display?”

        Of course! They can just about keep 60 fps at these refresh rates.

        See, please don’t take any of this as hostility; it’s just… well, criticism, and hopefully coming across as constructive. When you recommend a GTX 970, it’s usually something that’s just enough for most of us (hence why I always encourage 1080p benchmarks!) and something that will end up being future-proof for another two years at best (yes, at 1080p). A monitor, on the other hand, is a long-term investment, something that’s only replaced when a new one is NEEDED:
        - old one goes bad
        - need more space for productivity
        - sold old one with computer
        - extra screens

        Gaming is something very, very few of us make any money from, so it’s just something that we invest in when we *must* or when we have surplus money. And a new, higher-res monitor almost always requires a new GPU to keep up with the same performance. You’ll see a similar trend when people upgrade CPUs: if they have to change the motherboard *and* RAM to upgrade, they’ll keep delaying it till they really need a faster system (or have enough extra cash), because it’s more expensive.

        So saying that “it’s 2015 so upgrade” or “most of you have the same thing so get something ‘better’!”… it’s just not something I’d tell someone, even if my heart was in the right place.

        Like SSDs! I keep telling people to get an SSD! But they don’t, because they can put a 1TB HDD in their laptops instead, for half the price of a 256GB SSD. I, on the other hand, have 3 SSDs and a HDD in my desktop, plus another SSD in my laptop, *and* 1.5TB of external storage split b/w two drives. I know some people who’d just rather delete stuff than get an external drive. Anyway, I digress.

        • Jason181
        • 5 years ago

        Given the choice, would you choose framerates < 60 fps on a 2560×1440 monitor with g-sync, or ~120 fps on a 1680×1050 monitor without g-sync?

        Here’s the conundrum: driving 120 fps at 2560×1440 takes roughly the same graphics power as 60 fps at 4K, so that means at least two GTX 980s or R9 290Xs once AMD starts supporting variable refresh with CrossFire (plus the cost of a new monitor).
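        Rough math on that equivalence, assuming GPU load scales with raw pixels rendered per second (real games only approximate this):

            # Back-of-the-envelope pixel throughput; assumes load scales
            # linearly with pixels per second, which is only an approximation.
            qhd_120 = 2560 * 1440 * 120   # 442,368,000 pixels/s
            uhd_60  = 3840 * 2160 * 60    # 497,664,000 pixels/s
            print(qhd_120 / 1e6, uhd_60 / 1e6)   # ~442 vs ~498 Mpx/s: roughly comparable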

        Assuming most gamers considering this sort of setup already have at least one pretty powerful card, that means one additional high-end card and a monitor in the $650-$800 range to achieve 120 fps. So you’re looking at ~$1,000 minimum.
        If, in your opinion, G-Sync mitigates the need for 120 fps, then it would just be the cost of the monitor. I admittedly don’t have any experience with G-Sync/FreeSync or a 1440p monitor, which is why I’m asking.

          • Damage
          • 5 years ago

          I think you’re maybe overstating the starkness of the choice here.

          In my review of the GeForce GTX 960, I found it to be pretty good generally for gaming at 2560×1440. That’s a $200 graphics card that performs similarly to a GTX 680/770/7970 from years past. I’ve gamed at 2560×1440 on that class of card for years and enjoyed it.

          I’d say the combo of a GTX 960 and a 2560×1440 VRR display could be quite a nice thing. Would you always hit 144 FPS? Nah, but that’s not what matters. Avoiding slowdowns and general smoothness is what matters for good gaming. A VRR display lets you have the best experience your graphics card can manage, which makes it more forgiving than a quantized display.

          Now, even higher GPU performance is still nice. Consistently staying in the 10-15ms range for frame times is smooth and extra responsive. But most really fast twitch games don’t need a heavy-duty GPU to get there–and many others can be tuned for it for multiplayer.
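          Converting those frame times to rates is simple arithmetic, for reference:

              # Frame time (ms) to frame rate: fps = 1000 / frame_time_ms
              for ms in (10, 15, 16.7, 25):
                  print(f"{ms} ms -> {1000 / ms:.0f} fps")
              # 10 ms ~ 100 fps, 15 ms ~ 67 fps, 16.7 ms ~ 60 fps, 25 ms = 40 fps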

          So I guess I’m saying you need a balanced config, but my sense is that a $200-250 graphics card and a VRR display at 2560×1440 would get you to a pretty good place. That’s not a cheap total, but I’d stick with say a GTX 680 and a 2500K for another year or two if I could upgrade from 1080p60 to 2560×1440 VRR.

            • anotherengineer
            • 5 years ago

            “Avoiding slowdowns and general smoothness is what matters for good gaming.”

            The real bummer is that if FreeSync purportedly only costs an extra $15-$20 and is easy to implement, I don’t know why it isn’t being implemented on basic, run-of-the-mill 60Hz 1080p screens. Then at least gamers on a budget could get that “general smoothness” below 60 fps.

            Maybe in the future we will see FreeSync/Adaptive-Sync on even basic 1080p monitors? Or maybe not, because “gaming” gear has to mean 144Hz and/or expensive?

            Fingers crossed for widespread support for DP 1.3 and Adaptive-Sync in all future monitors.

            I would rather have that than built-in monitor speakers or even a monitor USB port.

            • Damage
            • 5 years ago

            I think Adaptive-Sync support could very well become ubiquitous and essentially zero-cost in the next few years. Naturally, the monitor guys are adding this feature first to their high-margin gaming products, but given how it works, this feature should be able to make its way into everything eventually. May take some time for it to be proven and refined to where it becomes mainstream, though.

            • jessterman21
            • 5 years ago

            Oh man, I hope so. I can’t justify spending over $200 on any computer part or peripheral since that’s usually where price/performance parity ends.

            It would be great in a few years if I can get a fast terabyte SSD and an Adaptive Sync 1440p monitor at the $199 price point.

            • Jason181
            • 5 years ago

            Thank you. That answers my question. Perhaps the serious hardware addiction and the desire to have it all were rearing their ugly heads.

            I think, based on your advice, a 1440p G-Sync monitor like the ROG Swift (https://techreport.com/review/26870/asus-rog-swift-pg278q-g-sync-monitor-reviewed) to replace my 1680x1050 120Hz monitor is the way to go (I already have a recently purchased GTX 980). I think you’re also absolutely right about the 2500K (I have a 2600K that I bought when they were first released).

            • Damage
            • 5 years ago

            Don’t sleep on the Acer XB270HU. IPS and 144Hz. Only $20 more than the ROG. Out now.

            • Jason181
            • 5 years ago

            Thanks for the info. I’ll wait till it’s in stock. Seems to be sold out everywhere (or everywhere I trust, anyway).

        • Anovoca
        • 5 years ago

        Yeah, I don’t know why people think they need to have the horsepower first before getting a nice monitor. No one said they have to run a 1440p monitor at 1440p or set a 144Hz monitor to a 144Hz refresh rate. Even if they are using a five-year-old 1080p TN panel, by getting a new 1440p TN panel and running it at 1080p res they will see a significant improvement, if for no other reason than they have a monitor with new lamps.

        I never understood the argument that you need to have the horsepower before you get the tools to use it. When was the last time you heard someone say they were only going to buy a 250GB hard drive because they only had 248GB of data to store?

        • End User
        • 5 years ago

        HOW DARE YOU BUY SOMETHING HIGH END
        HOW DARE YOU THINK YOU NEED THAT
        HOW DARE YOU SPEND MORE MONEY THAN I WOULD
        HOW DARE YOU UPGRADE WHEN WHAT YOU HAVE IS PERFECTLY FINE
        HOW DARE YOU SHOW ENTHUSIASM ABOUT A NEW PRODUCT
        HOW DARE YOU
        HOW DARE YOU
        HOW DARE YOU

        You really need to start The Cheapo Report to satisfy a large % of TR readers.

          • sweatshopking
          • 5 years ago

          you should whine and complain more. there isn’t enough of it these days.

        • blastdoor
        • 5 years ago

        “We hear generalized criticisms and then find out later the dude complaining is mostly legally blind.”

        This quote caught my eye (so to speak) because I am, in fact, borderline legally blind (even when wearing glasses). The funny thing is that I have discovered (contrary to my expectations) that I love high-resolution displays (in the class of what Apple calls “retina”). This is for basically three interrelated reasons:
        1. I get closer to the screen than others in order to compensate for my low vision
        2. Reducing fuzziness in text is always good
        3. I use the built-in zoom accessibility feature on Macs and iDevices, which essentially gives me the same ability to see pixelation that everyone else has, so the more pixels, the better

        In short, because of my various efforts to compensate for my bad vision, I end up benefiting from higher resolution.

        • credible
        • 5 years ago

        “But for a lot of folks, if you have a 2500K and some savings, I’d advise skipping your next CPU upgrade cycle and getting a new monitor.”

        Exactly what I did, in part because of something you said in a past graphics card review. I got a Dell U2712HM, and for me it’s a close second to my SSD as the best upgrade I have made.

      • Prestige Worldwide
      • 5 years ago

      I think the PR departments of GPU makers want tech sites doing benches on 2160p monitors these days. It makes no sense for most gamers but here we are.

        • JustAnEngineer
        • 5 years ago

        I believe that reviews have to be at least somewhat future-leaning: predict where this graphics card will perform when you *do* finally get around to upgrading to a 1440p or higher-resolution monitor. 1440p is the current sweet spot, in my opinion. 2160p still carries a price premium.

      • Anovoca
      • 5 years ago

      I totally agree. I also think Jennifer Lawrence is unattractive due to my low testosterone levels.

    • Ninjitsu
    • 5 years ago

    “The time has come for an awful lot of folks to consider an upgrade.”

    Why? I don’t understand. Most GPUs only now seem to be capable of delivering 60 fps at 1080p with all settings maxed out. Moving to a higher-resolution panel with SomeSync costs a lot more money than a GPU perfect for 1080p. So unless people actually NEED/WANT higher resolution, and are ready to additionally invest in a GPU capable of driving it, I’m not sure there’s much of an implicit reason to upgrade.

    I mean, if you’re still using a low-res CRT then sure, but otherwise? I suppose you’ll already know if you want a display. I bought my 1080p monitor in 2013 or early 2014, I think, and that too was primarily for productivity reasons; 1024x768 is kind of small for multitasking.

      • sweatshopking
      • 5 years ago

      i’m with you, bro. I don’t get it. I’ve had 1440p, and it’s nice, but unless you’re a full time excel worker, 1080p is enough for me.

      • _ppi
      • 5 years ago

      Given current Windows PPI scaling limitations, the key parameter as I see it is screen size, which you then pair with the optimal resolution.

      Win10 and the universal apps might improve that, at which point it will be time to go for 4K (and maybe downscale to 1080p for games).

    • derFunkenstein
    • 5 years ago

    So if AMD’s solution uses vsync for out-of-bounds frame times, and if it works without Scott feeling the transition (which he says is true), then there’s a technological benefit that users can feel, even before you get to the “this shouldn’t be much of a price premium” part, which we’ve yet to observe.

    Scott: what’s that pink port below the DVI? Is it a second microphone input or something? There’s already one on the headphone jack side.

      • Damage
      • 5 years ago

      Uh, yeah, it’s a mic jack. I think it would go to the PC maybe, to carry sound to an input. Not clear on this.

        • derFunkenstein
        • 5 years ago

        Oh, I see, if you’re confused enough to use this with DL-DVI instead of DisplayPort, since DVI doesn’t carry audio. Thanks.

    • southrncomfortjm
    • 5 years ago

    Freesync delivered. Awesome!

    Now, which large TV maker is somehow going to get this working on a 60inch LCD or OLED? I’d pay good money for that.

    EDIT: Looks like Panasonic tried this already: http://www.soundandvision.com/content/panasonic-4k-first-displayport

      • UberGerbil
      • 5 years ago

      The trouble is, there isn’t much incentive for the TV makers to do that. TVs are connected to gaming consoles, cable boxes, and AV equipment. None of those output VRR content. The relative handful of HTPCs in the world simply don’t matter. The only way TVs would pick up the feature is “by accident” — if the video processing circuitry they get from their vendors happened to include it (and the mfrs didn’t turn it off).

      The real push has to happen at the source end: you want the next generation of game consoles to not only offer variable refresh rates but make a big deal out of marketing it. That would create enough of an incentive for the TV makers to jump on the feature as a competitive advantage (and also a justification for consumers to go out and replace their TVs). There are a few other source products (Chromecast, etc) that might be able to take advantage of it, but it’s the game consoles that would really push the market.

        • southrncomfortjm
        • 5 years ago

        Oh yeah, I know it’s not worth their time now; that doesn’t mean I can’t wish they’d just up and do it.

        Getting consoles to go along with variable refresh would be great, but I don’t think they will, mostly because consoles tend to lock in around 30fps, nevermind getting to 60 or even 40, so they’d constantly be below the VRR threshold.

        The real team I’m rooting for here is Valve since they are the one company with the clout to really push couch PC gaming to the mainstream. If Valve can get millions of people using SteamOS to game from their couch (and that’s a big if), I’m sure they could convince some of the big TV makers to toss a compatible displayport port onto a large screen TV. Until then, I’ll just be content trying to get my TV to refresh at 120hz, which actually works on some TVs.

        Also, it’s not like putting PC inputs on TVs is a new thing. All my older TVs have VGA inputs. Why not a DisplayPort input?

    • sweatshopking
    • 5 years ago

    I won’t upgrade my monitor until i can run games on high @ 1080p and still get a reasonable frame rate. I have no desire to go up to 1440 or 2160 so i can play games on medium @ 30fps.
    Variable sync is nice, but variable sync @ 1080p is totally fine in my books given the current power of GPU’s.

      • derFunkenstein
      • 5 years ago

      Your R9 290 should push a 1440p display no sweat.

        • Ninjitsu
        • 5 years ago

        It seems to be just about managing 1440p. Almost all results are under 60 fps:
        http://www.anandtech.com/bench/product/1068

        Heck, even the 290X is all over the place, but generally closer to 60 fps:
        http://www.anandtech.com/bench/product/1439

          • sweatshopking
          • 5 years ago

          yeah, bioshock isn’t even 60fps. Who wants that?!

          • derFunkenstein
          • 5 years ago

          wtf I seem to remember it being faster than that in reviews. Statement more-or-less withdrawn.

          Guess I need to give my GTX 970 a bit more of a hard run.

            • Ninjitsu
            • 5 years ago

            Guess it depends on the games, really. Like, the 980 seems to be just perfect for 1080p but it’s obviously too expensive when compared to a 970, which just about manages 60 fps minimums like the 290X.

            • derFunkenstein
            • 5 years ago

            Personally I’m OK with dialing down details a little bit to get my display’s full resolution with vsync enabled. On Diablo 3 I’m perfectly happy with “low (smooth)” shadows just so when things get super hairy the framerate doesn’t get cut in half. On StarCraft II, that means “high” instead of “extreme”. Actually, in most games that means “high” instead of “uber high” or whatever, and occasionally like in Crysis 3 it means medium.

            This is with a GTX 970 on a 1440p display.

        • sweatshopking
        • 5 years ago

        and yet, games run like crap at 1440p

          • derFunkenstein
          • 5 years ago

          OK yeah, I guess i missed that you wanted to max everything out. Makes the “master race” people seem kinda dumb. 😉

      • Ninjitsu
      • 5 years ago

      Yup, my thoughts exactly. I don’t see the point of upgrading just because two thirds of people on a tech site have a particular resolution.

      • Damage
      • 5 years ago

      http://i.lvme.me/ullaqld.jpg

        • sweatshopking
        • 5 years ago

        we’ve had this conversation. The games I play don’t run all that well on high @ 1080p. I had a 1440p monitor for roughly 3 months; while I appreciated it in Excel, I dropped the gaming resolution to 1080p in almost everything in order to keep graphics settings up and fps at a decent level. I have a 75Hz IPS display, and until I can REALLY push more pixels and maintain graphics settings and FPS, i’m not upgrading.
        I’d be quite happy with variable sync and 1080p, but no desire to go beyond right now.
        700$ for a monitor is a LOT of money, that isn’t EVEN CLOSE TO MY BUDGET, and even if money wasn’t an issue, i can’t imagine spending that kind of cash for the slight increase in pixels. Again, I had a 1440p screen, and while it was *nicer* it wasn’t nice enough to spend the extra cash AT THIS POINT. In a few years, when my IPS dies and the price has fallen, sure. But today? Not to me.

        http://www.pcgamer.com/total-war-attila-on-extreme-settings-is-designed-for-future-gpus/

        http://www.sweclockers.com/image/diagram/9228?k=d17ee4f16645b113623e92622847e850
        Attila @ 1440 on high at 35 fps? Nah, thanks.

        http://www.sweclockers.com/image/diagram/9229?k=5f79f2521ae4d2a517d134a47b32ad72
        And @ 1080p on high this is bad enough.

          • anotherengineer
          • 5 years ago

          I can see your point. I bought my 120Hz Sammy RZ2233 over 5 years ago; it’s 22″ 1680×1050 and can push the frames. I would rather have that than a 1440p at 30 fps. Now, G-Sync or Adaptive-Sync at 1440p would give the impression of a fluid 120Hz, though. But at this point in time, it is a luxury. I am not a pro gamer, and don’t need a new vid card or a new monitor. I don’t think my wife and kids would be impressed if they got nothing for their birthday & Christmas because a new vid card and monitor blew the budget so I could play games, just more fluidly than I can now.

          However, in the future, when my vid card and my monitor do have to be replaced, I think it would be nice to have an adaptive-sync card, especially if there will be inexpensive 60Hz IPS 1080p panels with adaptive sync that will hopefully do 24Hz, because then you can game on the cheap with fluid motion. And it is definitely good to see technology moving forward.

          I am really hoping in the future they get adaptive sync down to 20 or 24Hz and then implement it on inexpensive 60Hz 1080p screens.

            • sweatshopking
            • 5 years ago

            Sure, i totally agree with this. Monitors/gpu’s aren’t cheap, and until prices fall SIGNIFICANTLY and I’m monitorless I can’t justify the change. I will want a much better GPU before I consider going above 1080p though.

            • anotherengineer
            • 5 years ago

            I am running my 1080p 60hz Dell P2214h on my main machine now, and do miss the 120hz sammy which is on the spare pc. However I do like the colours, viewing angles and pixel density more on the Dell P2214h. I also prefer the 16:10 of the sammy.

            If only I could get a ~20″ 1920×1200 IPS panel with adaptive sync and a minimum of 85Hz (preferably 120Hz) for $250 or less.

            • sweatshopking
            • 5 years ago

            Yeah, 250$ is all i’m ever going to pay for a screen. The law of diminishing returns comes into effect about there.

            • derFunkenstein
            • 5 years ago

            There are other reasons for higher resolution, but if all you do with your PC is game then I guess I see your point. I went to a 1440p display for the extra desktop space while running a VM and using several office-type apps for work.

            • sweatshopking
            • 5 years ago

            Exactly. Desktop usage. It’s *awesome* for Excel.

        • DPete27
        • 5 years ago

        I think you’re ignoring the COST portion of the equation. I myself would LOVE a 27″ 1440p variable refresh monitor like this, but when you factor in the $600 cost of the monitor AND a possible $350+ GPU upgrade that can push 1440p at 60+ fps with high detail, that pill gets a bit tough to swallow.

          • sweatshopking
          • 5 years ago

          the only card that can push 1440p on NEW games consistently at 60+ fps is a titanX. it’s not $350. Nevermind games that’ll come out in the next 6-12 months.

          I don’t think he’s ignoring the cost. If you want these features and tech, this is what it costs. He says it’s a solid upgrade, and I think he’s likely right. Whether YOU or I are willing to pay for that upgrade or can afford it is a separate question.

          • anotherengineer
          • 5 years ago

          The point of FreeSync/G-Sync is the low refresh rates, which is why I wonder why they are not implementing them on regular 60Hz monitors.

          Also those screens are like $800 here in Canada, and a GTX 970 after tax is typically $500 range, so ya, $1300+ gets a bit out there for smooth gaming at high res and low/hi refresh rates.

        • Milo Burke
        • 5 years ago

        -1? This may be Damage’s least popular post of all time!

      • ptsant
      • 5 years ago

      In my experience, the difference between “high” and “ultra” or even “medium” and “high” is not that great. I would seriously consider going down to “medium” in order to get a better frame rate at 1440p. The improvement would come from variable refresh, not from the higher resolution.

        • sweatshopking
        • 5 years ago

        Sure, and that’s the line people usually give me when I say this. I disagree, though, and i’d rather have ambient occlusion, high-quality shadows, etc. vs slightly better resolution.

        Like i said, i look forward to variable sync, and think it’s much more interesting than a better resolution. G-Sync or the likely-DOA FreeSync look interesting, and in a few years, when my 1080p IPS dies, I will pick up a monitor which supports the winner.

        • Ninjitsu
        • 5 years ago

        Medium looks horrible in most games*. High is probably the minimum I play at. In games like Arma 3 it’s a downright disadvantage if you can’t at least turn most stuff to very high and ultra at your native resolution.

        EDIT: *To me.

    • superjawes
    • 5 years ago

    So…it looks like Nvidia’s goal in all of this wasn’t nefarious, but it wasn’t exactly wise.

    G-Sync does appear to be superior, even if that is only by a small margin, and Nvidia have thought through problems and come up with good solutions. The problem is that despite going for the “best” solution in terms of performance, FreeSync is still a massive improvement, and if AMD certified displays continue to run below G-Sync display prices, then there’s not much point in paying the bigger premium.

    So we’ll see what happens…I suspect that Nvidia will have to implement something that utilizes DP’s adaptive capabilities just to maintain market coverage.

      • sweatshopking
      • 5 years ago

      I think they likely won’t. AMD has what, 25% of the market? Nvidia controls 75% of the gaming gpu market. They don’t have to do anything. They will win thanks to market share. freesync is DOA.

        • superjawes
        • 5 years ago

        Eh, even if Nvidia have a lead, the easier solution could just be adopted for buzzword compliance, and one thing that I’m hoping for is adaptive sync in televisions to handle content at different frame rates. (That might actually be on HDMI to adopt and implement, though.)

          • sweatshopking
          • 5 years ago

          Maybe, but only if it gets picked up by Intel. If they don’t implement it aggressively, G-Sync will win.

          • Ryu Connor
          • 5 years ago

          HDMI support isn’t the only hurdle. You’d need the technology to work at lower frame rates for it to be useful for 30hz and 24hz content.

            • superjawes
            • 5 years ago

            If the content is fixed, I’m really not worried about that, as you can multiply the refresh rate to something higher (so 24 FPS content would refresh at 48 Hz, with each frame shown twice). Gaming has bigger issues because nothing is fixed, and you want maximum fluidity on refresh times to get maximum animation fluidity.
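            A quick sketch of that idea, purely for illustration (the repeat_refresh helper below is hypothetical, not how any particular TV or scaler actually works): pick the smallest whole-number multiple of the content rate that falls inside the panel’s supported range.

                # Hypothetical helper: lowest integer multiple of a fixed content
                # rate that lands inside a display's supported refresh range.
                def repeat_refresh(content_fps, lo_hz, hi_hz):
                    for n in range(1, 11):
                        if lo_hz <= content_fps * n <= hi_hz:
                            return content_fps * n, n   # (refresh rate, repeats per frame)
                    return None

                print(repeat_refresh(24, 40, 144))   # (48, 2): 24 fps shown at 48 Hz, each frame twice
                print(repeat_refresh(30, 40, 144))   # (60, 2)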

        • the
        • 5 years ago

        Well if Intel steps into the ring and adopts adaptive refresh via DP 1.2a, then the tables can easily turn. While no serious gamer uses Intel graphics (well unless that’s all they have in a laptop), Intel has a large enough share of the total graphics market to be the market leader. Remember that not everyone is a gamer.

        • _ppi
        • 5 years ago

        It does not matter what the GPU market share is. It matters what the monitor market share is.

        The second you have a FreeSync display, unless AMD seriously underperforms (think GeForce 8000 series vs. Radeon 2000 series), AMD is then the clear default choice for such users.

    • tsk
    • 5 years ago

    Yayy finally! I don’t quite agree with you Scott that freesync is good enough below the VRR window. But then again I’m in my twenties and have falcon eyesight, so that might be a deciding factor 😉

      • Damage
      • 5 years ago

      You have used both FS and GS in real games and noticed a big difference? How were you sure you weren’t seeing a GPU performance difference?

      Edit: Note this is a discussion about the relative merits of display tech. I still recommend tuning your games and having enough GPU to ensure you’re above 40Hz generally.

        • tsk
        • 5 years ago

        Good point about the GPU performance difference, although from what I’ve personally seen, yes, there is a substantial difference below the VRR window.

        Edit: I don’t think this is that much of a problem tho; like you said, you should be tuning your settings to stay within the VRR window. A minimum of 30Hz would be nice tho.

          • Topinio
          • 5 years ago

          Well, the theoretical range is a factor of ~6.76, e.g. 36-240, 21-144, 17-120, or 9-60 Hz; it seems maybe the first-generation scalers can’t quite manage that, though, so it’s ~40-144 Hz IRL. Not bad, though.
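          Taking that ~6.76 figure at face value (I haven’t verified where it comes from), dividing common maximum rates by it lands close to those example minimums:

              # Divide common maximum refresh rates by the quoted ~6.76 ratio;
              # the results land near the example minimums given above.
              RATIO = 6.76
              for hi_hz in (240, 144, 120, 60):
                  print(f"{hi_hz} Hz max -> ~{hi_hz / RATIO:.1f} Hz min")
              # 240 -> ~35.5, 144 -> ~21.3, 120 -> ~17.8, 60 -> ~8.9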

          If I knew that the BenQ’s height range took it under 510 mm when down on the stand I’d’ve ordered one just now based on Scott’s excellent review. (BenQ’s spec page is lacking and the manual is still not up on their site; the monitor’s been available in the UK since before the driver came out and has been tempting me … now there’s a decent review I can trust, it’s on my probable purchase list.)

      • xenonite
      • 4 years ago

      I have been using the Asus ROG Swift for the past couple of months on an overclocked dual 780 Ti SLI system, and one thing I simply cannot understand is how people with supposedly good vision, even trained reviewers who also know what they are looking for, find 30-40 fps G-Sync acceptable for creating an illusion of smooth motion.

      I have struggled with monitor-induced migraines and nauseating motion sickness for most of my life, such that I could never comfortably play any computer games at 60 fps for more than a few minutes at a time. 144Hz variable-refresh-rate technology (G-Sync and FreeSync) has finally given me the opportunity to game comfortably at 110-144 fps for up to 2 hours at a time.

      Contrary to the popular belief that variable refresh rates have the biggest impact at low framerates, I would not describe the experience as being even close to realistic “smooth motion,” but G-Sync has had a very big impact in reducing the framerate-mismatch-induced juddering above 110Hz (which was unplayable for me unless the game could reliably run at a fixed 120 fps for an extended amount of time). I can clearly tell when the framerate drops below about 100 fps on the ROG Swift (and I do not have particularly good eyesight), which makes low-framerate gaming seem just as pointless as it did before the arrival of G-Sync. Therefore, I don’t think that the lower framerate limits of this technology are as big of an issue as people seem to think, simply because the illusion of smooth motion gets destroyed long before you reach the 30-40 fps zone.

      Granted, some people do probably perceive sub 100Hz refresh rates on sample-and-hold type displays as exhibiting a smooth motion blurring effect, but I also believe that many consumers (like myself) experience that same sequence of frames as an object exhibiting discontinuous, jumping motion with a very fast oscillatory behaviour (at one discrete position before and after the step location) around every new step location, which makes it very hard to track and accurately aim at.

      Sadly, it seems that the common myth that “the human eye cannot see more than <insert arbitrarily low numerical value here> fps” has doomed the future of truly smooth, virtual-reality-like motion for people like me, as most game designers (and even my fellow electrical engineers) seem to have stopped pushing the boundaries of high-framerate displays, believing that 120-144Hz is enough to convey a truly smooth illusion of motion.
