Personal computing discussed

Moderators: renee, morphine, SecretSquirrel

 
Toby
Gerbil In Training
Posts: 4
Joined: Mon Dec 23, 2013 12:36 am

GSync Install, Thoughts

Sat Jan 04, 2014 10:57 pm

Hey all! I've got the GSync module installed in my Asus VG248QE. I'm loving it so far; more details and high-speed footage to come. In the meantime, I thought I'd share my install experience:

The box:

Image

UPC! (Ready for the shelf ;) )

Image

Top layer in the box with the GSync board:

Image

Second layer: Instructions, parts, power supply:

Image

Second layer spread out:

Image

Power supply; as you'll see later, the original monitor includes an internal power supply while the GSync board does not. Note the company name. :)

Image

Manuals, sticker, pry tool, DisplayPort cable

Image

Gsync board

Image

Gsync board from the DisplayPort angle. Note: DisplayPort only.

Image

Gsync board power hookup

Image

The "brains" of the Gsync board. I really wanted to remove the heatsink and take a look under the hood but figured I'd better not this time around. Note the RAM.

Image

Parts and monitor ready to go.

Image

Prying off the back shell. I don't care for this part of modding. (I miss screws)

Image

Back shell off.

Image

Board ready to go.

Image
Guts removed from panel. Note the guts are taped on from the manufacturer!

Image

A look at the "backside" of the original guts. The board on the right is the power supply.

Image

Original board/ps vs. the Gsync board.

Image

If you're installing one, note that you have to twist the ends of the LVDS connectors to make sure the cable reaches and faces the right direction.

Image

Twist so they look like this before plugging in:

Image

The back reinstalled with Gsync guts inside.

Image

Case back on.

Image

New OSD (looks nice!) 144hz.

Image

I'll save the subjective assessment and high speed videos for later. For now, I'll comment on the VG248QE itself before/after the mod:

- As stated, power supply is now external.
- As stated, only DisplayPort is supported.
- All monitor controls are gone with the exception of brightness. I'm not disappointed by this, however; most monitor options/OSDs have gone off the deep end, and the VG248QE is no exception. All the other options can be set with video drivers.
- The OSD is replaced with what you see in the last screenshot. I love this OSD; it states very clearly the resolution, refresh rate (max), and mode (G-Sync/V-Sync).
- The minimum brightness on the monitor is a bit brighter than before, not a huge deal though.
- GSync is enabled via a driver setting under 3D Settings.

More to come!
 
Toby
Gerbil In Training
Posts: 4
Joined: Mon Dec 23, 2013 12:36 am

Re: GSync Install, Thoughts

Sat Jan 04, 2014 10:58 pm

Here's the high speed Vsync Off vs. G-Sync video:


https://www.youtube.com/watch?v=Z4-ApJA0Zag

More to come.
Last edited by Toby on Wed Jan 08, 2014 9:24 pm, edited 1 time in total.
 
Pville_Piper
Gerbil XP
Posts: 347
Joined: Tue Dec 25, 2012 2:36 pm
Location: Pville...

Re: GSync Install, Thoughts

Sun Jan 05, 2014 12:44 am

reserved for more comment.. :lol:
Windows10, EVGA G2 750w Power Supply, Acer XB270H G-synch monitor, MSI Krait Gaming 3X, I7 6700K, 16 gigs of CORSAIR Vengeance LPX DDR4 3200 MHz ram, Crucial 500 gig SSD, EVGA GTX1080 FTW
 
Ryu Connor
Global Moderator
Posts: 4369
Joined: Thu Dec 27, 2001 7:00 pm
Location: Marietta, GA
Contact:

Re: G-Sync Impressions

Sun Jan 05, 2014 2:46 am

Topic merged.
All of my written content here on TR does not represent or reflect the views of my employer or any reasonable human being. All content and actions are my own.
 
Amazing Mr. X
Gerbil
Posts: 13
Joined: Mon Sep 23, 2013 8:44 am

The G-Sync Experience: ULMB, 3D Vision, and 144Hz.

Thu Jan 09, 2014 3:15 am

Moderator / Admin Notice: I apologize if this thread is posted in the wrong place, feel free to move it if it is. My PM was never answered, so I just decided to post it in the most appropriate place I could think of.

Notice: This review constitutes a set of opinions gathered from a week of general usage. No scientific testing was conducted. As such, there will likely be discrepancies between my impression of the experiences and more accurate data. In all cases of science, I defer to the following articles posted by The Tech Report and Blur Busters. These people are far smarter than I and know far more than I do. They also have wonderfully clear and scientifically collected data, none of which I have. If you find my review inaccurate or below your standards, please feel free to read theirs.


What is this thread?

Simple: the G-Sync giveaway run by Nvidia and The Tech Report requires winners to give feedback to the community. As one of the winners, this thread is where I plan to continuously give that feedback. This, of course, will start with just this one post about my experiences thus far. However, it is very likely to expand well beyond that in the future.

What is G-Sync?

This is a question I think most people haven't had answered just yet. Not that I think the technical data surrounding G-Sync is incomplete, but I do believe it's largely difficult to draw any actual conclusions from it. The idea behind G-Sync is extremely old, older than the VESA standards, born from the birth of computer graphics. It's simply the idea of having a display device refresh synchronously, in time with a computer application. So, if I were to make a program that was to display a duck, this sort of technology would update the screen with the duck whenever the program says the duck is ready to be displayed. It's a very basic way of doing things, which places quite a lot of weight on the programmer getting the timing right.

You see, screens of all varieties have physical limitations. CRTs can only draw a new line so quickly at a fixed resolution, and LCDs need to draw a line slowly enough to overcome transitional blurring. As a result, screens have traditionally refreshed on a fixed cycle, usually measured in Hertz, or Hz for short. At a typical 60Hz refresh rate, a monitor will copy the buffer from a video card exactly that many times per second, and draw the contents of that buffer exactly that many times per second. All of this slowness and careful timing induces problems with drawing a program directly to a screen. The problem is that a screen might not be ready to draw an image by the time a program is. If a program updates the screen too quickly, we end up with two different pictures being drawn on our display in a single refresh. This is what translates into tearing and runt frames. However, there's also the inverse problem. A screen could be ready to update before a program is ready. In this case, the screen merely repeats whatever it has available until something new arrives. This is what translates to stutter and micro stutter.
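
For the curious, here's a minimal sketch of that interaction (my own illustration, not anything from Nvidia or the articles above): an idealized fixed 60Hz scanout with V-Sync off, where any frame that finishes inside a scanout window swaps the buffer mid-scan (a tear), and a window with no new frame just shows the old image again (a stutter). All numbers are made up for illustration.

Code:
REFRESH_HZ = 60
PERIOD_MS = 1000.0 / REFRESH_HZ            # ~16.7 ms between scanouts

def classify_refreshes(frame_times_ms, refreshes=12):
    """Label each fixed scanout window: 'tear' if one or more frames finished
    inside it (the buffer swapped mid-scan with V-Sync off), 'repeat' if nothing
    new finished (the old image is shown again, i.e. a stutter)."""
    finish_times, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        finish_times.append(t)
    labels = []
    for r in range(refreshes):
        start, end = r * PERIOD_MS, (r + 1) * PERIOD_MS
        swaps = sum(1 for f in finish_times if start < f <= end)
        labels.append("tear" if swaps else "repeat")
    return labels

# A game alternating between 12 ms and 30 ms frames produces both artifacts:
print(classify_refreshes([12, 30] * 10))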

Um... but what is G-Sync?

I'm getting there. G-Sync is partially a form of V-Sync. Now, V-Sync has been around for a while, and all it does is tell a program to wait until the next refresh is done before sending out the next image. This solves half the problem with display timing, but not all of it. The problem is that this eliminates runt frames and tearing, but does not account for stutter and micro stutter from a program that slows down. Instead, most V-Sync implementations make the problem worse, by making the program wait even longer than needed in order to produce the already late frame. This is where most people get it wrong, however; V-Sync doesn't induce the stuttering, it only makes it worse. The stuttering is always there. With V-Sync off it's more than possible for a frame to miss a refresh and induce a stutter.
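
To put a number on that "makes it worse" point, here's a tiny back-of-the-envelope sketch (my own, with illustrative numbers only) of how V-Sync on a fixed 60Hz display turns a frame that misses the refresh boundary by a fraction of a millisecond into a frame that shows up a whole refresh late.

Code:
import math

REFRESH_MS = 1000.0 / 60                   # ~16.7 ms per refresh

def vsync_display_time(render_ms):
    """With V-Sync on, a frame becomes visible only at the first refresh
    boundary after it finishes rendering."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

print(round(vsync_display_time(16.0), 1))  # 16.7 -> made it, shown on the next refresh
print(round(vsync_display_time(17.0), 1))  # 33.3 -> missed by 0.3 ms, held a full extra refresh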

All of that said, Nvidia tried to alleviate the exacerbation of stutter using V-Sync, by implementing a new system some time ago. Their answer, Adaptive V-Sync, is a terminally mis-marketed technology that simply disables V-Sync whenever a program slows and misses an expected refresh. This technology does eliminate the usual V-Sync issues, but does nothing to actually stop the stutter entirely, which is unfortunately contrary to the sensationalized marketing material accompanying it. Adaptive V-Sync was largely ignored after launch, but it is currently an actual component of G-Sync. I believe this is important to mention, as I have yet to read anyone else discuss it.

Well, okay, Adaptive V-Sync... right... but what's G-Sync?

Alright, now here's the part where things get interesting: G-Sync is a one-step magical cure-all to stutter and micro stutter, used in conjunction with Adaptive V-Sync to remove all traces of visual artifacts from the display refresh system.

So, um... what is G-Sync?

This is the problem really. It's why G-Sync is so terminally difficult to even describe, and why capturing it in a video for playback on a display is nearly impossible. G-Sync is several technologies and ideas thrown together into a single unified system. At its core G-Sync is actually a promise, more than anything else. It's a plan on the part of Nvidia to dedicate itself to eliminating a set of problems no one has given a name yet. This, I believe, is why G-Sync comes on a programmable chip rather than a dedicated piece of silicon. Nvidia probably will continue to add components to and modify G-Sync as the product continues to move forward. As such, these impressions of mine will likely only hold true in regards to the beta product of G-Sync, and only on the ASUS VG248QE in particular. I do not claim to think that G-Sync will be the same experience by Q2 2014, or even on the 2560x1440 displays recently announced at CES.

G-Sync is a very beta product, and any feelings I have on it now are only in regards to its current beta status.

All of that just for another warning? Come on! Give it to me straight! What is G-Sync?

From my understanding, G-Sync is a set of technologies used in unison:

    Adaptive V-Sync: For eliminating tearing artifacts when a program is faster than a refresh cycle on a monitor.
    Adaptive Refresh Rates: A technology built into the DisplayPort VESA standard which allows a graphics card to adjust the timing of a monitor's refresh rate however it chooses.
    G-Sync: A technology which matches Adaptive Refresh Rates on a monitor to the exact timing of each individual frame in a full screen program from 30Hz to the maximum the monitor allows, instantaneously.

To put it simply, G-Sync is a piece of technology bundled with other technology that solves what V-Sync can't. It's an instantaneous, lag-free system for smoothing the delivery of all frames to the maximum extent that technology can currently allow. It's also the name of the overall package, which includes Adaptive Refresh Rates to make the system possible using existing cable technology, along with Adaptive V-Sync to smooth out the distribution of frames above the threshold the monitor can handle.
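
As a rough way to see the difference, here's a small sketch (my own mental model of the behavior described above, not Nvidia's actual algorithm) comparing how long each frame sits on screen under a fixed 60Hz refresh with V-Sync versus a variable refresh constrained to the 30Hz-144Hz window.

Code:
import math

MIN_HZ, MAX_HZ = 30, 144
MIN_HOLD, MAX_HOLD = 1000.0 / MAX_HZ, 1000.0 / MIN_HZ   # ~6.9 ms .. ~33.3 ms

def hold_times_fixed(frame_times_ms, hz=60):
    """Fixed refresh + V-Sync: every frame is held for a whole number of
    refresh periods, so a 20 ms frame occupies two 16.7 ms refreshes."""
    period = 1000.0 / hz
    return [round(math.ceil(ft / period) * period, 1) for ft in frame_times_ms]

def hold_times_variable(frame_times_ms):
    """Variable refresh: each frame is held for roughly its own render time,
    clamped to the panel's 30-144Hz window (slower than ~33.3 ms forces a
    repeat, faster than ~6.9 ms has to wait for the panel)."""
    return [round(min(max(ft, MIN_HOLD), MAX_HOLD), 1) for ft in frame_times_ms]

frames = [12, 20, 25, 14, 35]              # hypothetical render times in ms
print(hold_times_fixed(frames))            # [16.7, 33.3, 33.3, 16.7, 50.0] -> quantized, uneven
print(hold_times_variable(frames))         # [12.0, 20.0, 25.0, 14.0, 33.3] -> tracks the game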

But that's just three technologies! You made it sound like there were a bunch of them!

There are.

This is where G-Sync gets confusing. You see, those are the technologies already built into the very-much-beta, fully-reprogrammable, and not-entirely-finished G-Sync technology itself. When you open up the Nvidia Control Panel and go to the Set Up G-SYNC page, hitting enable turns on all of these together.

However, there are other technologies in the physical G-Sync kit which are not yet a part of G-Sync.

As a hint, they're all listed in the title.

What is ULMB?

Ultra Low Motion Blur is Nvidia's official response to the idea of 2D lightboost. It's essentially a button on the monitor which turns on 2D lightboost in one press, without any of the fuss that was once involved in it. ULMB is also significantly better than lightboost was: it lacks the color and contrast issues that plagued lightboost, and it appears to work a little better overall.

Before you ask, lightboost was an Nvidia technology designed to eliminate pixel persistence by strobing the backlight of a monitor in time with the refresh rate. It was designed for 3D Vision 2, and was aimed at eliminating the nasty blurring artifacts which many users complained about with the original 3D Vision. However, 3D users eventually discovered, by way of a driver bug, that lightboost could be used in 2D mode to eliminate blur just as well. What this ultimately does in both 3D and 2D modes is completely eliminate the sort of "LCD blur" which makes people defend and cling to their CRTs. A driver hack was created for people with 3D Vision 2 compatible monitors and spread throughout the internet like wildfire.
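
A quick back-of-the-envelope calculation (my numbers, not any measured spec) of why the strobing helps: on a normal sample-and-hold LCD the image stays lit for the whole refresh, so an object your eye tracks smears across the retina; strobe the backlight for only a short pulse each refresh and the smear shrinks to roughly the pulse width.

Code:
def perceived_blur_px(speed_px_per_s, lit_time_ms):
    """Eye-tracking motion blur is roughly how far the object moves while the
    frame stays lit: speed * persistence."""
    return speed_px_per_s * (lit_time_ms / 1000.0)

speed = 1000.0                      # object panning across the screen at 1000 px/s
full_hold_120hz = 1000.0 / 120      # ~8.3 ms: frame lit for the whole refresh
strobe_pulse = 1.5                  # ms: hypothetical backlight pulse per refresh

print(round(perceived_blur_px(speed, full_hold_120hz), 1))  # ~8.3 px of smear
print(round(perceived_blur_px(speed, strobe_pulse), 1))     # ~1.5 px of smear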

2D lightboost was far from perfect, though, and in fact had some very serious problems. For one, Nvidia's 3D Vision glasses are tinted. It's a slight byproduct of the overall design of the kit, but lightboost had to have contrast and color settings modified in order to make videogames appear correct while wearing the 3D glasses. As such, in 2D mode, lightboost gave the wrong color and contrast settings and made games appear oddly washed out.

ULMB corrects all of this. While it functions precisely like lightboost whenever 3D Vision is enabled, using the button on the display to activate the 2D mode turns off the color correction and contrast differences designed for the 3D glasses. As such, it completely removes LCD motion blur with only one actual drawback: a slightly dimmer display. However, on an overly bright panel like the one in a VG248QE, this is hardly an issue.

You're talking about 3D Vision a lot there. What is that? Isn't 3D a gimmick designed to sell Televisions?

That's true. 3D in movies and in television is largely a gimmick designed to sell more useless products to consumers. HOWEVER, Nvidia's 3D Vision is not just a gimmick. In fact, any 3D technology for video games is a fantastic idea.

Why? I don't want things to get flung in my face while playing a game.

That's just it, really. Nvidia's 3D vision doesn't do that. Often, the best 3D Vision titles aren't games where something flies right at you.

No, 3D Vision is really all about depth perception. Basically, in a standard videogame it's really hard to judge distances. When we do judge distances, it's almost always in relation to other objects. This is simply because we're looking at something our brain knows is flat on the surface of a monitor. While we certainly know that things are in motion, and do have some ability to judge distance relatively in these 2D videogame worlds, we lack any actual precision in knowing exact distances, especially when a game world is sparse and contains few objects to compare distances against. This is how a lot of open world games get away with having mountains or sky boxes no taller than anything else in the level. Distance in videogames, and the appearance of depth, is a complete illusion.

That is, until you apply actual 3D technologies. At that point, the illusions fall away, and everything becomes as precise in the game world as it is in real life. Take my experience with Eurotruck Simulator 2 as a good example. While it's easy enough to drive one of these trucks in 2D mode, judging the distance to the car in front of you, for a critical task like stopping, is difficult to do accurately. Try this picture for instance. Could you tell me how far away the truck is from the car in front? How about that car from the car in front of it? How long, then, before the truck driver has to start braking? You could probably guess if you have experience with the game, or even make a fairly educated guess by counting car lengths. However, in 3D mode, I'd only need to glance at this image to give you an answer.

Wonderful, a monitor advertisement and an advertisement for 3D Vision. Great. What does this have to do with the G-Sync upgrade kit?

Simply put, the G-Sync upgrade kit includes all of these components. So, should you be considering buying a G-Sync kit for a VG248QE, you'll be buying three technologies: G-Sync, ULMB, and 3D Vision compatibility.

Now, let's drop the Q and A format for a second and discuss why this is important. All three of these technologies could be considered critical for playing games. G-Sync eliminates artifacts from the refresh cycle, ULMB eliminates LCD motion blur, and 3D Vision allows for precise depth perception. All three of these features have games they work better in, and games where they do practically nothing. All three of these technologies are interesting, and all three are proprietary components of the G-Sync upgrade kit, exclusively owned and updated by Nvidia.

However, there's a problem. A big problem. A problem that's difficult to deal with, ugly, and likely representative of the kit being a very beta product.

None of these three technologies work together.

That's my big problem with the G-Sync upgrade kit thus far, and it's not a small one.

Want to play Eurotruck Simulator 2 without crashing into things? Sure! Turn on 3D Vision and go! Oh, but don't expect G-Sync to work with it.

Well, okay then, how about a game where G-Sync is more important like Metro Last Light? I've always wanted to play that without stutter, can I just turn off 3D vision when I start that game?

Nope! You have to go into the Nvidia Control Panel and SWAP from 3D vision to G-Sync.

Well, Metro looks great with G-Sync, but I miss my old lightboost mode. Can I turn that on now?

Nope. The ULMB button will give you a nice OSD message letting you know that isn't possible.

Um, so what do I need to do to get it working?

Turn off G-Sync in the Nvidia Control Panel and lower the refresh rate to 120, 100, or 85Hz. Then the ULMB button will work.

What? I can't use ULMB at 144Hz? But G-Sync works just fine at 144Hz!

I know, but apparently the backlight doesn't like the extra 24Hz, and neither does 3D Vision.

What About 3D Vision and ULMB? Can I tick on 3D Vision in the Nvidia Control Panel and use ULMB at 120Hz? I promise I'll turn off 3D vision when the game starts!

Not while 3D vision is enabled. It has to be disabled. It can't just be off. It has to be completely disabled in the Nvidia Control Panel for ULMB mode to work.

That sounds like a mess!

It is.

There's no nice way of saying this, unfortunately. Nvidia's G-Sync kit is an end-user nightmare of continuously opening, navigating, setting, and resetting options in the Nvidia Control Panel between games. If you want ULMB for twitch shooters like Call of Duty or Counterstrike, then you have to go disable G-Sync, turn down your refresh rate to 120, 100, or 85Hz, and make sure 3D Vision is completely disabled as well. If you want G-Sync for competitive StarCraft II play or an unparalleled experience in Metro Last Light, you need to disable 3D Vision completely, make sure ULMB is off, turn up your refresh rate to the highest it goes, and enable G-Sync in the Nvidia Control Panel. If you want 3D Vision for Eurotruck Simulator 2 or War Thunder, then you need to completely disable G-Sync, turn down your refresh rate to 120Hz, make sure ULMB is off, and enable 3D Vision through the Nvidia Control Panel.

It's a mess.

A really huge mess.

Enough of a mess to turn people away.

And the sad part is that we haven't even gotten to the hardest and most difficult part of the G-Sync kit yet.

The installation is a nightmare.

Even if you have a fair amount of skill assembling computers, I wouldn't advise attempting to install the G-Sync upgrade kit yourself, not for the VG248QE anyway. I got my G-Sync kit the day before New Year's, and started installation sometime early in the afternoon. It was only twenty minutes till 2014 before the monitor was back together with my G-Sync kit fully installed. I won't claim to be the most competent of technicians, and I'm sure the video I recorded of the process will make me look like a complete fool when I finally get around to uploading it to YouTube. Still, following Nvidia's initial and somewhat flawed install instructions to the letter resulted in hours of work, actual damage to the bezel of my monitor, and having to take the whole thing apart three times. Even after all of that, I still couldn't figure out how certain pieces were supposed to attach, and despite my best efforts the inside components still don't line up correctly with the holes on the back of the monitor. I ended up with an extra screw from the install which refused to stay where it was supposed to go, along with a back plate for the new power connector that does not seem to have any way to snap or screw into place.

Overall the install process was painful, long, nerve wracking, and not something I think even the most confident of IT techs should commit to trying without serious research. I certainly wouldn't recommend it to anyone, not even people I don't like, and especially not people who haven't opened up something other than a computer before.

If I had paid 200 dollars for this kit, I'd be mad about it.

Usually when you win something in a contest it's supposed to make you feel good. It's like a present from a website or a company that you love, and getting something like that during the holidays should have made it that much sweeter. That's not how I feel about G-Sync, though. Overall I feel like I just broke even on this thing: my investment in time and frustration roughly matches the payoff of the features awarded to me. If I had added a cost of 200 dollars on top of that, I'm not sure I'd still be okay with it. The kit is currently too rough around the edges to be a strong and expensive consumer product, and the install process is too harrowing to safely sell it as an upgrade kit to a monitor.

If you own a VG248QE and you're excited about the G-Sync conversion kits, I'd tell you to either have a professional install the kit for you, or wait to buy a new monitor with G-Sync pre-installed.

While Nvidia has upgraded the installation instructions and made a video recently, I still believe that this is too much to ask of an ordinary consumer.

It's all too obvious these monitors were never designed to be taken apart, and most people will never be able to install one of these kits properly on their own.

Okay, we get it. You're a G-Sync hater. You probably love Free-Sync, don't you?

No, it's not like that at all. After spending a lot of time with the kit, I have to say that I'm actually really happy with G-Sync itself, and I think that it definitely offers a different and better alternative to V-Sync-based approaches like triple buffering, while making animation and camera panning far smoother than it's ever been in video games. I just didn't want to go into discussing the benefits of G-Sync without giving you a thorough understanding of what's wrong with it.

The truth is, G-Sync is a really amazing technology showcase. Even with all of its maddening quirks, its difficulties, and its problems, I still find myself liking G-Sync a lot.

So, what are the benefits of G-Sync?

I'm going to cheat and say the first benefit is bringing 120Hz+, ULMB, and 3D Vision to a whole new range of monitors. Before, we had to thoroughly research monitors to discover whether they had specifications like that, but now Nvidia has effectively established a baseline for extremely competent gaming monitors. There's no more trial and error, or guesswork, or assuming forum posts are right. Now, all of these features are included on any monitor with the G-Sync branding attached to it. This is a big step up, and a really positive event for the entire monitor industry.

It's a shame that AMD isn't included on the G-Sync tech itself, but it at least appears possible that Free-Sync-type tech may work on G-Sync monitors thanks to Nvidia's usage of tech built into the DisplayPort standard. At the very least, these monitors have a guaranteed level of high refresh rate support for non-Nvidia users. That's already more than they had before.

For us Nvidia users, G-Sync means a much larger base of players with 3D Vision capable monitors, and a larger pool of people who can try 3D Vision glasses. This means Nvidia is more likely to expand 3D Vision support or update 3D Vision profiles more frequently. When you combine this with the fact that 3D compatibility is built into DirectX 11.1 on Windows 7 and Windows 8, along with the next gen consoles all supporting these feature levels, this all starts to look like a really good future for 3D Vision. We may have a lot more "excellent" rated 3D games in the future, and that's certainly not a bad thing at all.

Finally, Nvidia's G-Sync controller board is gigantic and extremely overbuilt, so it's more likely to take to larger resolutions and faster refresh rates. We're already seeing new G-Sync monitors with 2560x1440 resolutions at 120Hz, and I've heard we may see 1080p monitors at 177Hz in the future. Combine these high quality Nvidia boards with the new upcoming Display Port 1.3 standard and we could see 4K monitors with high refresh rates far sooner than we see graphics cards capable of actually pushing them.

Which actually segues nicely into the next topic.

So, what does G-Sync mean? Specifically, G-Sync the technology and not just G-Sync the kit.

Having G-Sync on is a wonderful experience, in certain games, but it's not really all that noticeable in all games. I noticed it the most in Metro Last Light, which is a fine example of a game that isn't remotely optimized in any way shape or form. On the highest settings, turning on G-Sync in Metro Last Light felt like playing a whole new game. Facial animations were all startlingly well done, giving a clear impression that the characters were all actually speaking. Movements were fast and sharp, even with motion blur on. Swapping out, firing, and reloading weapons was all clean and fast as well. In this game G-Sync gives the impression of playing at well over 200 FPS. It's just faster and smoother than I ever thought possible, and without Nvidia's pendulum tech demo to test G-Sync myself, Last Light instantly sold me on the capability of the technology. There really isn't anything like watching a hundred or so PhysX particles fly around with silky smoothness while reloading in a heated firefight. G-Sync completely removes every last instance of the horrible stuttering we've come to expect from the 4A Engine, and then some. Words can describe it easily enough, but they hardly do it justice. A brick and mortar store would do well to have G-Sync on while looping the Metro Last Light benchmark over and over again on a test system.

Though, there's a problem with using Last Light as the poster boy for why G-Sync is a good idea. Last Light has some of the worst problems with stutter, tearing, and animation quality in the industry. This hardly makes the game fair comparison material, as there are many games with significantly cleaner and better animation quality and fewer stuttering issues. I'll be testing other games more thoroughly in the future, but I've noticed that F.E.A.R. doesn't necessarily benefit in any noticeable way from using G-Sync.

That said, messing with the quality options in Metro Last Light yielded interesting results. The higher the frame rate got in the game, the less profound the difference was between G-Sync and the standard result. On a high refresh monitor like this one, getting closer to 144 fps and 144Hz didn't feel that much different from playing with G-Sync disabled, though it still definitely felt a bit better. The largest difference I experienced was around the 30 to 60 fps range. In this range, the game felt impossibly smooth, and it got me questioning whether high refresh monitors or even high frame rates were really all that necessary in the face of smoothing technology this powerful. If a game like Last Light can stutter like mad at 40 fps, a speed some people consider to be 'unplayable', and yet be rendered totally silky smooth by G-Sync, it makes a very strong case for buying lower-end Nvidia cards and putting the money saved towards a G-Sync monitor, or for using incredibly demanding features like driver-forced SGSSAA or 3D Vision.
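
To make that 30-60 fps sweet spot concrete, here's a little arithmetic sketch (mine, illustrative only) of a steady 40 fps on a fixed 144Hz refresh with V-Sync versus the same 40 fps with the refresh following the frames: on the fixed refresh, the 25 ms frames can only be shown for whole multiples of ~6.9 ms, so their on-screen times bounce between ~20.8 ms and ~27.8 ms, which the eye reads as judder.

Code:
import math

REFRESH_MS = 1000.0 / 144                  # ~6.94 ms per fixed refresh
FRAME_MS = 1000.0 / 40                     # 25 ms per frame at a steady 40 fps

# Fixed refresh + V-Sync: a frame appears at the first refresh boundary after
# it finishes, so its on-screen time is the gap between consecutive boundaries.
finish = [(i + 1) * FRAME_MS for i in range(12)]
appears = [math.ceil(f / REFRESH_MS) * REFRESH_MS for f in finish]
fixed_hold = [round(b - a, 1) for a, b in zip(appears, appears[1:])]
print(fixed_hold)                          # a mix of ~27.8 and ~20.8 ms -> judder

# Variable refresh: every frame is simply held for its own 25 ms.
print([round(FRAME_MS, 1)] * len(fixed_hold))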

The bottom line is that G-Sync obliterates the idea that higher frame rates, or even more evenly distributed frame rates, translate into smoother animation. It completely turns the tables on the status quo, and really changes things. It's incredible technology, and I'm really excited to see where it can go from here. Though I'm also kind of scared that it might not go where it needs to, seeing as it just feels so incomplete at this point.

What Nvidia needs to do.

G-Sync is terrible as an upgrade kit, and selling it that way should never be done again unless the installation is handled by certified, professional technicians. That doesn't really worry me too much, though; I don't think too many people will get turned off of G-Sync by buying a frustrating and expensive upgrade kit.

What really needs to happen is for more of the features in the kit to be included in the G-Sync technology suite. John Carmack, at one of the G-Sync events, talked about bringing ULMB into G-Sync, and I have to agree that this is Nvidia's real next step. There's no reason why a circuit board as huge as G-Sync's, with a fully programmable chip design, can't make an already strobing backlight strobe less quickly in time with lower monitor refreshes. I can understand that it might not be able to take the extra 24Hz of the LCD panel itself, but even if it just turns off the strobing or locks it to 120Hz when you go over that threshold, these technologies really shouldn't be separate. There shouldn't be any reason why a competitive FPS player would choose a Low Motion Blur mode over G-Sync. The two should really just be designed to work together.

Next, Nvidia has to do something about swapping back and forth between these technologies. Right now, it's a complete mess moving from 3D Vision to ULMB to G-Sync. 3D Vision doesn't do anything exotic with screen refreshes apart from the backlight strobing, so there's no reason why 3D Vision shouldn't be compatible with G-Sync either. At the very least, turning off 3D Vision with the keyboard shortcut or the button on the IR emitter should swap directly to G-Sync without a user having to go and change over in the Nvidia Control Panel. The way it's set up right now can't be the way it all works in the consumer product. It's a disaster right now.

Finally, G-Sync needs a more stable driver. It's not G-Sync's fault, but the last stable Nvidia driver on my computer was 314.22. All of the other ones since have had serious problems, with the most recent two WHQL drivers and the surrounding betas being the worst. This isn't good considering G-Sync needs 331.77 or higher in order to function. Nvidia's current drivers conflict horribly with my sound card and USB drivers, causing strange USB-related Blue Screens to pop up at random with absolutely no warning. They also induce strange stutters which don't seem to go away, even after the driver crashes and resets. The stutters slow down my extremely powerful computer to a crawl reminiscent of a full stop, but tend to vanish after a full reboot. I'm not sure what causes these issues, but I don't have any of them with the 314.22 drivers, and yes, I've checked extensively to make sure my overclocks aren't doing it. I have my graphics cards running at stock now and it hasn't increased or decreased the frequency of crashes and stutters under the new drivers. It's not my CPU or my RAM either, as I've also checked those extensively at stock speeds; it's just these drivers that do it. So, until Nvidia finally finds a stable replacement for 314.22, or just squashes the horrible bugs in 332.21, I really don't think many will find G-Sync appealing.

Or any brand new Nvidia product, for that matter; their drivers just really need a solid bug-fixing release.

My Final Thoughts on G-Sync or TL;DR

G-Sync is a promise from Nvidia. It means higher quality monitors, better quality gameplay experiences, and a dedication to making gaming better. Though, in its current form, it's questionable whether it accomplishes that or just induces frustration. It's a very beta-level product with tons of weaknesses and tons of positive points. It has a lot going for it, and a lot that needs to be improved.

The good news is that the capacity is there to add these improvements in theory. The question is whether or not Nvidia will support these needed improvements moving forward. I have my doubts, but at the winning price of "free" I was more than willing to bet my monitor on it.

Though I think the upgrade kit is too difficult to install and that most people should just wait for actual G-Sync monitors to release, or have a professional install an upgrade kit for them.

Either way, G-Sync is a diamond in the rough, an emerging technology that has plenty of room to improve into something truly wonderful. Assuming Nvidia can deliver on the promises, G-Sync has my endorsement.

My Computer Specs.

CPU: Intel i7 3770k
Cooler: Kraken X60
Mobo: Gigabyte G1.Sniper 3
RAM: 16GB (4x4GB) G.Skill (Two kits of F3-2400C10D-8GZH)
GPU: EVGA GTX 680 FTW+ 4GB w/backplate
PhysX: EVGA GTX 560 Ti 448 Core Classified Ultra
PSU: Thermaltake Toughpower Grand 1200w
Storage: 2x 1TB Western Digital Drives. Caviar Black (Primary) and Caviar Green (Storage)
Case: Cooler Master Storm Stryker
CD/DVD Burner: ASUS DRW-24B1ST

Primary Monitor: VG248QE (G-Sync from 680)
Secondary Monitors: ACER AL2216W & ACER AL1716 (from 560 Ti)

Sound card that causes Blue Screens with new Nvidia drivers: TASCAM US-122L (Connected for XLR Microphone)

Accessories:
3D Vision 2 kit from Nvidia
Razer Onza
Razer Death Adder 2013
Razer Black Widow Ultimate Stealth Edition

Thoughts on "Free-Sync"

I find it unlikely that a large company like Nvidia, with a lot of exceptionally competent engineers, would spend a ton of time and money implementing, in an incredibly complicated manner, something that already exists. My guess is that the systems built into eDP for laptops are power saving features, intended to lower refresh cycles in environments where they aren't necessary in order to save power. For example, limiting the output of the desktop to 30Hz instead of 60Hz to save battery life.

My guess is that this technology either doesn't translate well into what Nvidia's doing, as it may not be fast enough to synchronize completely with the frame-rate output of a graphics card, or it may just not do what they need. If the two don't jibe perfectly, then you might get a 42 fps stream displayed at a 44.5621 Hz refresh, or a similar mismatch, which would still produce either tearing or stuttering of some kind in the final image.
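
Using the made-up numbers from the paragraph above, a quick sketch of why even a small rate mismatch still produces periodic hiccups: the per-frame timing error accumulates until a whole scanout goes by with nothing new to show (or, with V-Sync off, a tear).

Code:
frame_hz, refresh_hz = 42.0, 44.5621       # the hypothetical mismatch from above
frame_ms = 1000.0 / frame_hz               # ~23.81 ms between new frames
refresh_ms = 1000.0 / refresh_hz           # ~22.44 ms between scanouts

drift_per_frame = frame_ms - refresh_ms            # ~1.37 ms of slip per frame
frames_until_hiccup = refresh_ms / drift_per_frame # ~16 frames until a scanout has nothing new
seconds_until_hiccup = frames_until_hiccup * frame_ms / 1000.0

print(round(drift_per_frame, 2), round(frames_until_hiccup, 1), round(seconds_until_hiccup, 2))
# -> roughly every 16 frames (~0.4 s) a frame gets repeated or torn anyway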

Also, the inclusion of triple buffering does not sit right with me. Triple buffering, unfortunately, can mean a lot of things. More often than not it means odd buffering which can induce latency. If it's anything like the triple buffering in The Stanley Parable, you won't feel it above the maximum refresh of the monitor, but there will be really noticeable problems with mouse movement below that. Yes, problems, and not necessarily lag problems either. It might be that a combination of driver-forced 8xSGSSAA and driver-forced HBAO+ is messing with the triple buffering algorithm Valve uses in that version of the Source engine, but sometimes I feel like the mouse doesn't do what I tell it to in that game with triple buffering on.

I'm not saying AMD's implementation couldn't work; I'm sure it'll end up being a "good enough" Cheap-Sync. However, monitors that support "free-sync" are not guaranteed to support a minimum feature set similar to G-Sync's. At least, not yet. In the future, that may change. For now, there's really no telling.

Color me hopeful, but not expecting miracles.

Future Updates Go Here:

Let me know if there are any games, configurations, or applications you want me to try with my VG248QE. I'll make new posts for them, and link to them here if I can. I'm more than willing to spend more time with G-Sync in applications or scenarios I've already left feedback on, if you want more details from me.

I apologize deeply for any grammar or spelling or formatting errors. I swear, English is my first language, but sometimes I just miss things. I'm only human, after all, and cyborg upgrades are expensive.
 
Jigar
Maximum Gerbil
Posts: 4936
Joined: Tue Mar 07, 2006 4:00 pm
Contact:

Re: The G-Sync Experience: ULMB, 3D Vision, and 144Hz.

Thu Jan 09, 2014 6:17 am

OMG, amazing input mate, thanks a lot for posting ^ in detail.
Image
 
JohnC
Gerbil Jedi
Posts: 1924
Joined: Fri Jan 28, 2011 2:08 pm
Location: NY/NJ/FL

Re: The G-Sync Experience: ULMB, 3D Vision, and 144Hz.

Thu Jan 09, 2014 9:00 am

That was A LOT of words :o
Amazing Mr. X wrote:
I find it unlikely that a large company like Nvidia, with a lot of exceptionally competent engineers, would spend a ton of time and money implementing, in an incredibly complicated manner, something that already exists.

I agree.

Amazing Mr. X wrote:
3D in movies and in television is largely a gimmick designed to sell more useless products to consumers. HOWEVER, Nvidia's 3D Vision is not just a gimmick.

I've actually experienced both of these, using active glasses, and I completely disagree. I like watching movies like Avatar with well-implemented 3D effects, whereas using such gimmicks in current games does absolutely nothing to enhance my enjoyment or improve my performance in them. Maybe in the future, when (in addition to VR goggles) full VR suits with treadmills are widely available and every game requires them, I might change my opinion, but right now Nvidia's 3D Vision is still an unnecessary gimmick designed to attract some simple-minded sheeple :wink:
Gifter of Nvidia Titans and countless Twitch donation extraordinaire, nothing makes me more happy in life than randomly helping random people
 
chuckula
Minister of Gerbil Affairs
Posts: 2109
Joined: Wed Jan 23, 2008 9:18 pm
Location: Probably where I don't belong.

Re: The G-Sync Experience: ULMB, 3D Vision, and 144Hz.

Thu Jan 09, 2014 9:25 am

Thanks for the detailed writeup! From what you wrote, and from a lot of your complaints, one thing about G-sync is very clear: It's obviously at the beta-test quality/simplicity level right now. A bunch of the issues you had (like even having to hack a monitor in the first place) are due to the fact that G-sync is obviously not 100% mature yet and Nvidia still has a bunch of polishing to do before it is a refined feature that people can use without too much hassle.
4770K @ 4.7 GHz; 32GB DDR3-2133; Officially RX-560... that's right AMD you shills!; 512GB 840 Pro (2x); Fractal Define XL-R2; NZXT Kraken-X60
--Many thanks to the TR Forum for advice in getting it built.
 
superjawes
Minister of Gerbil Affairs
Posts: 2475
Joined: Thu May 28, 2009 9:49 am

Re: The G-Sync Experience: ULMB, 3D Vision, and 144Hz.

Thu Jan 09, 2014 9:38 am

chuckula wrote:
Thanks for the detailed writeup! From what you wrote, and from a lot of your complaints, one thing about G-sync is very clear: It's obviously at the beta-test quality/simplicity level right now. A bunch of the issues you had (like even having to hack a monitor in the first place) are due to the fact that G-sync is obviously not 100% mature yet and Nvidia still has a bunch of polishing to do before it is a refined feature that people can use without too much hassle.

Bingo. This kind of change is going to take work (especially to get other technologies and ideas to work with it), and it also needs people to see the benefits in order to spread the word about the technology. My hope is that adaptive refreshing will come to all displays in some form. And yes, the upgrade kit will go away fairly quickly as soon as the design is complete.

Next up, I am fairly certain that this thread will be merged with this one. I think they're trying to keep winner impressions in one place.

Thanks for the detailed write up, by the way. You should probably have it reviewed a couple times and see if you can get it published on a blog or something. It was fun to read, and helps cover a lot of bases.
On second thought, let's not go to TechReport. It's infested by crypto bull****.
 
Prestige Worldwide
Gerbil Elite
Posts: 765
Joined: Mon Nov 09, 2009 10:57 pm

Re: The G-Sync Experience: ULMB, 3D Vision, and 144Hz.

Thu Jan 09, 2014 11:37 am

Amazing Mr. X wrote:
Though, there's a problem with using Last Light as the poster boy for why G-Sync is a good idea. Last Light has some of the worst problems with stutter, tearing, and animation quality in the industry. This hardly makes the game fair comparison material, as there are many games with significantly cleaner and better animation quality and less stuttering issues. I'll be testing other games more thoroughly in the future, but I've noticed that F.E.A.R doesn't necessarily benefit in any noticeable way from using G-Sync.


I think as far as FEAR goes, I can understand GSync not having a very noticeable impact. Things are already going to be extremely fluid since you are:

- Playing a game from 2005 on 3770k and GTX 680
- Have a 144Hz Screen
- Probably using fpslimiter.exe or DXTory to lock your game at 85-100 fps since that is what most FEAR 'pros' consider to be the sweet spot
- Likely playing at a lower resolution in the 800x600 to 1024x768 range if you are a competitive FEAR player


I played FEAR multiplayer at 960x540 100hz (custom resolution in nvidia control panel), 100fps frame limit using DXtory, vsync OFF with a BenQ XL2420t 120hz monitor.

The first time I played FEAR with a 120hz monitor, it was so smooth it was unbelievable. RMAing my 1st XL2420t due to stuck pixels and using a 60hz Samsung while I waited was like playing in slow motion.

Do you still play FEAR today? Every few months I open the server browser and just see MXT CTF with 1.5 runspeed and awful custom maps being the only populated server and ALT-F4 out. I miss the older "pure" competitive setting servers running league settings, 1.2-1.3 runspeed and stock maps.
8700k@5GHz, Custom Water Loop | ASRock Fatal1ty Gaming K6 | 32GB DDR4 3200 CL16
RTX 3080 | LG 27GL850 144Hz | WD SN750 1TB| MX500 1TB | 2x2TB HDD | Win 10 Pro x64
X-Fi Titanium Fatal1ty Pro | Sennheiser HD555 | Seasonic SSR-850FX | Fractal Arc Midi R2
 
Ryu Connor
Global Moderator
Posts: 4369
Joined: Thu Dec 27, 2001 7:00 pm
Location: Marietta, GA
Contact:

Re: G-Sync Impressions

Thu Jan 09, 2014 12:49 pm

Thread merged.

Thanks for all the contributions.
All of my written content here on TR does not represent or reflect the views of my employer or any reasonable human being. All content and actions are my own.
 
Techgoudy
Gerbil First Class
Posts: 142
Joined: Tue Oct 02, 2012 5:01 pm

Re: G-Sync Impressions

Thu Jan 09, 2014 2:43 pm

Nicely done guys, extra kudos go out to Amazing Mr. X for the extremely easy-to-read, yet still very detailed writeup. Someone give this guy a job doing this stuff. I can honestly say I really enjoyed reading your writeup, and that's a hard thing for me to say because I usually won't read an entire post that long.

Anyway, from what I can tell this technology is extremely fascinating, yet it has too many quirks for me to buy one just yet. I believe that technology like this will help pave the way for a whole new set of highly optimized gaming monitors. The only issue I have with it so far is that Nvidia plans to keep it to themselves for the most part, which in one way is great for the business, but horrible for the consumer. In my opinion, technology like this should be widespread and available on all platforms.
 
Aphasia
Grand Gerbil Poohbah
Posts: 3710
Joined: Tue Jan 01, 2002 7:00 pm
Location: Solna/Sweden
Contact:

Re: G-Sync Impressions

Fri Jul 10, 2015 2:26 pm

If this is going to continue to be a sticky, I think we should continue to add to it. Especially now when we are starting to have good comparisons out between G-Sync and Freesync.
Also, really good write up from Amazing X.

While G-Sync seems to be just a tad better technically in most respects, it's not so much better that it matters much. I think the biggest thing with G-Sync is actually adaptive syncing in conjunction with ULMB. I do have one issue, though: G-Sync wasn't developed purely because Adaptive Sync isn't enough. While the engineers might have worked on it from a technical standpoint, Nvidia as a company developed it for one purpose... to make more money. Which is the reason it's proprietary; it's also probably the reason it doesn't seem to be fabbed under license but requires extra hardware that is bought from Nvidia.

Now, my take is that Freesync is good enough in just about every respect except for the top 1-2% of users, and it's done in a non-proprietary way that is supposed to be a standard, included in the readily available driver chips. And that is something to be applauded.

All in all, adaptive refresh rates are a really good future for gaming no matter which way you go.
 
rahulahl
Gerbil Team Leader
Posts: 256
Joined: Thu Aug 16, 2012 8:57 am
Location: Australia

Re: G-Sync Impressions

Sat Jul 11, 2015 12:20 am

Aphasia wrote:
If this is going to continue to be a sticky, I think we should continue to add to it. Especially now when we are starting to have good comparisons out between G-Sync and Freesync.
Also, really good write up from Amazing X.

While G-Sync seems to be just a tad better technically in most respects, it's not so much better that it matters much. I think the biggest thing with G-Sync is actually adaptive syncing in conjunction with ULMB. I do have one issue, though: G-Sync wasn't developed purely because Adaptive Sync isn't enough. While the engineers might have worked on it from a technical standpoint, Nvidia as a company developed it for one purpose... to make more money. Which is the reason it's proprietary; it's also probably the reason it doesn't seem to be fabbed under license but requires extra hardware that is bought from Nvidia.

Now, my take is that Freesync is good enough in just about every respect except for the top 1-2% of users, and it's done in a non-proprietary way that is supposed to be a standard, included in the readily available driver chips. And that is something to be applauded.

All in all, adaptive refresh rates are a really good future for gaming no matter which way you go.

As it stands, Freesync does not work in certain situations: when the FPS is above or below its threshold, which seems to be very narrow. So when you actually want the adaptive sync to work, unless you fall within very specific frame rate parameters, it simply will not work. Gsync has no such issues. Its threshold range is so much bigger than Freesync's that it's extremely unlikely anyone will find it too restrictive.

Another main point is that Gsync works in windowed mode. A lot of applications simply do not have a full screen mode. In these circumstances, Gsync will work while Freesync will not.
Even when there is an option to choose between windowed mode and full screen mode, most people would choose windowed mode if not for the impact on performance. Gsync fixes this impact so well that my games in borderless mode now perform identically to full screen mode. At least for me personally, this has been a noteworthy change.

So in my book at least, while I would certainly like Freesync to be as good as Gsync, the reality is that it is simply too lacking. Gsync for now is still a far better option than Freesync if you do not mind being locked to Nvidia. Personally, I was going to go Nvidia anyway, simply because they typically have better 99th-percentile frame time results. The only thing that changes now is that if I ever want to go back to AMD, I will have to sell my monitor and get a new one, which hopefully by that point will be a more open standard and compatible with everything. But as it stands, I am very happy with my purchase, and would gladly recommend the product to my friends.
Intel i7 4790K
MAXIMUS VII RANGER motherboard
EVGA GTX 1080
16GB Ram
550Watt Seasonic PSU
Asus Rog Swift PG278Q
Windows 10
 
Aphasia
Grand Gerbil Poohbah
Posts: 3710
Joined: Tue Jan 01, 2002 7:00 pm
Location: Solna/Sweden
Contact:

Re: G-Sync Impressions

Sat Jul 11, 2015 10:51 am

I think saying that anyone would choose such-and-such settings is something to be taken very lightly, because I basically don't know anyone who uses full-screen windowed mode if they can avoid it. With seamless alt-tabbing, what do you see as the point of full-screen windowed mode compared to just full screen? For the people who use windowed mode, sure; for the rest, not so much. As for a lot of applications not having a full screen mode, I have yet to see one that doesn't; it's usually the other way around when you're using windowed mode, depending on whether you do as I do and move the taskbar to a non-standard position, etc.

But true, since I don't use it, and don't know anyone that does, I hadn't even considered windowed mode as a point.

As for the range, that is up to the monitor makers and what they can get out of the panels; Freesync by itself doesn't have a limit, unless you count 9-240Hz as limited. Now then, on my MG279Q, Asus has chosen to use 35-90Hz for its supported range, and that is a limit; I would've liked to have it go up to at least 120Hz. But the reports say they could not get the IPS panel working correctly that high. So that is not a Freesync problem, probably more of a market problem, with the chip driver having the tech "for free". And currently, the choices for fast IPS gaming panels are quite limited in themselves. The new Acer X34 3440x1440 IPS panel uses a 30-75Hz range for its Freesync. And you do have the ability to use V-Sync outside of the Freesync range.

That said, in BF4, which I play the most, I can just punch everything up to Ultra settings at 2560x1440, which is about what my single card can do at a steady 90fps, and then I limit the game engine to max out at 89 fps. And everything is buttery smooth with V-Sync off. And I have yet to see it drop below 35 fps, either. Now, had it supported 120Hz, I would be even happier, because I could get close to that on High settings, but I would suffer slowdowns which would force me to use V-Sync.

But the basic premise, being able to turn off V-Sync and have a smooth experience free from tearing and any V-Sync slowdown jitter, is probably good enough, especially for what is basically a free technology. Which is also why Nvidia will probably never support it: it doesn't give them any extra inflow of cash.

I would love to see the following scenario play out, though no, it will never happen.
* Nvidia starts supporting Adaptive-Sync as well, to support all Adaptive Sync monitors. After all, this should be a driver update in most respects.
* Nvidia continues to sell and develop G-Sync and sell G-Sync monitors as well.
Now, how much market share would G-Sync continue to have in this scenario, and how many people would find it worth the $250 difference... (Asus Predator G-SYNC TN panel vs. Asus Predator Freesync IPS right now in my current store)

But then, I guess that is what the price premium gets you. You get a forced minimum quality that costs a bunch of extra $, and you don't have to shop around looking at how it has been implemented on a specific monitor. On the other hand, you have the price premium and market lock-in.

Which brings me to the last point: "don't mind being locked in" is precisely what I mind. That is the reason we are now down to two choices of graphics in the market at all and are limited in how we want to run things. While it's good business for Nvidia, it's very anti-consumer, but that is for the R&P section really. It's something that can be discussed to death and shouldn't be mixed with the technical merits.

In reality, we can probably discuss the differences for as long as we want to. I made my last post mainly not to incite any kind of comparison, but to actually have a reference point in the thread for Amazing X's write up, now that there are Freesync panels in circulation. Back then it was partly guesswork; now we have the answer to how it really works. And the basic premise is... within the normal range, it's more than good enough for what is probably a pretty decent chunk of the market. Now if you are talking specific needs, no, it doesn't seem to be as good as G-Sync, but on the other hand it doesn't cost a price premium.

I used Nvidia for a good many years from when the first Geforce came out, after leaving ATI and Matrox behind, then I actually switched back to ATI a bunch of years later when ATI was very competitive again. And if I thought the price/performance was the best, I would probably switch back to Nvidia despite the lock-in, but currently, those factors together add up to less perceived value for me.
 
NoOne ButMe
Gerbil Elite
Posts: 707
Joined: Fri May 15, 2015 9:31 pm

Re: G-Sync Impressions

Sat Jan 16, 2016 8:12 pm

rahulahl wrote:
As it stands, Freesync does not work in certain situations: when the FPS is above or below its threshold, which seems to be very narrow. So when you actually want the adaptive sync to work, unless you fall within very specific frame rate parameters, it simply will not work. Gsync has no such issues. Its threshold range is so much bigger than Freesync's that it's extremely unlikely anyone will find it too restrictive.

As it stands, G-Sync does not work in certain situations: when the FPS is above or below its threshold. So when you actually want the adaptive sync to work, unless you fall within its frame rate parameters, it simply does not work.

To put it simply, the Freesync range for most people is effectively as good as the Gsync range. That's the problem with Gsync: for the main reason for using it, gaming, Freesync is just as good in over 99 percent of all scenarios. Ninety-nine point nine percent is probably more accurate.


rahulahl wrote:
Another main point is that Gsync works in windowed mode. A lot of applications simply do not have a full screen mode. In these circumstances, Gsync will work while Freesync will not.
Even when there is an option to choose between windowed mode and full screen mode, most people would choose windowed mode if not for the impact on performance. Gsync fixes this impact so well that my games in borderless mode now perform identically to full screen mode. At least for me personally, this has been a noteworthy change.

Where do you get this from? When it comes to people making YouTube videos, and maybe a very small minority outside that. If you're doing a video for YouTube, neither solution is going to help with all the problems YouTube will cause for the video. So only a small minority benefits from this.

This change was very good. More features are better for the consumer.

rahulahl wrote:
So in my book at least, while I would certainly like Freesync to be as good as Gsync, the reality is that it is simply too lacking. Gsync for now is still a far better option than Freesync if you do not mind being locked to Nvidia. Personally, I was going to go Nvidia anyway, simply because they typically have better 99th-percentile frame time results. The only thing that changes now is that if I ever want to go back to AMD, I will have to sell my monitor and get a new one, which hopefully by that point will be a more open standard and compatible with everything. But as it stands, I am very happy with my purchase, and would gladly recommend the product to my friends.
I think everyone wants Freesync to be fully as good as Gsync. For almost every user besides yourself, Freesync is effectively as good as Gsync.
For most people, I would call Gsync the worse option, as most people would benefit more from putting an extra $250 into a faster CPU, GPU, or an SSD than into doing-things-around-the-edges-better.
currently running: Clevo W230SD, i7-4710MQ, 1TB SSD, 960m, win10 Pro + 1 HP Pavilion 22XI monitor and sometimes a 1080p 32" Vizio TV.
 
Prestige Worldwide
Gerbil Elite
Posts: 765
Joined: Mon Nov 09, 2009 10:57 pm

Re: G-Sync Impressions

Sun Jan 17, 2016 7:55 pm

Sick necro, bro
8700k@5GHz, Custom Water Loop | ASRock Fatal1ty Gaming K6 | 32GB DDR4 3200 CL16
RTX 3080 | LG 27GL850 144Hz | WD SN750 1TB| MX500 1TB | 2x2TB HDD | Win 10 Pro x64
X-Fi Titanium Fatal1ty Pro | Sennheiser HD555 | Seasonic SSR-850FX | Fractal Arc Midi R2
 
End User
Minister of Gerbil Affairs
Posts: 2977
Joined: Fri Apr 16, 2004 6:47 pm
Location: Upper Canada

Re: G-Sync Impressions

Sat Jun 25, 2016 10:11 am

I'm very happy with my new G-Sync setup (GTX 1080 + XB271HU). Games run smooth as silk.
 
Demetri
Gerbil
Posts: 45
Joined: Sat Aug 23, 2014 11:48 am

Re: G-Sync Impressions

Sun Aug 21, 2016 4:32 pm

Some thoughts on adaptive sync now that I've got a Freesync-capable card and have tested it on three different monitors: Samsung S24E370DL (IPS/PLS panel, 24-inch, Freesync), Eizo FG2421 (VA panel, 24-inch, 120 Hz + strobing backlight), and my trusty old HP w1907 (TN, 19-inch).

Most of the testing was done in Overwatch. Starting with the Samsung... Freesync really shines if you can't hold a consistent framerate equal to the monitor's refresh. The game looks like garbage when capped @ 55 fps. Simply flipping the Freesync switch clears all that tearing and stuttering up. The thing is, if you can simply hold 60 fps, that's not much of an issue anyway. What I found kind of strange is that I still had a bit of tearing when capped @ 60, even with Freesync on, so it actually looked slightly better when capped at 55. Maybe someone smart can tell me why that is. This monitor can also do 75 hz, and in my opinion, that's the best way to run it, even without Freesync, assuming you have enough juice to maintain the framerate.
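For what it's worth, here's a back-of-envelope look at why a cap just below the maximum refresh can behave differently from a cap right at it. This assumes a Freesync window that tops out at 60 Hz, which is an assumption rather than anything from the monitor's spec sheet:

Code:
# Frame time vs. refresh interval near the top of an assumed 60 Hz window.
VRR_MAX_HZ = 60.0
min_refresh_interval_ms = 1000.0 / VRR_MAX_HZ  # fastest refresh the panel can do

for cap_fps in (55, 60):
    frame_time_ms = 1000.0 / cap_fps
    slack_ms = frame_time_ms - min_refresh_interval_ms
    print(f"{cap_fps} fps cap: new frame every {frame_time_ms:.1f} ms, "
          f"slack vs. the {min_refresh_interval_ms:.1f} ms refresh floor: {slack_ms:+.1f} ms")

At 55 fps every frame has about 1.5 ms of slack and stays comfortably inside the window; at exactly 60 fps you sit right on the boundary, where any timing jitter can push frames outside it and bring tearing back.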

With the Eizo, the jump to 120 Hz was an immediately obvious improvement in motion clarity, and turning on the strobing backlight was another significant jump. Unfortunately the panel comes with drawbacks: the color can't match the Samsung and HP, and the viewing angles are really poor. Even if you're viewing it perfectly centered, the edges still look more washed out than the middle. This monitor is going up for sale, but 120 Hz is definitely a must-have for my next monitor. I could live without the strobing. One thing to note with this monitor is that the strobing doesn't work properly @ 60 Hz; you can turn it on, but there are a bunch of weird graphical glitches.
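To put rough numbers on the motion-clarity jump: for eye-tracked motion on a sample-and-hold panel, perceived smear is roughly persistence multiplied by on-screen speed. The 960 px/s speed and the ~2 ms strobe pulse below are arbitrary, illustrative figures:

Code:
# Rough persistence-blur estimate for eye-tracked motion.
# smear (px) ~= persistence (s) * on-screen speed (px/s)
speed_px_per_s = 960.0  # arbitrary: an object crossing a 1920-px screen in 2 s

cases = {
    "60 Hz sample-and-hold":  1 / 60,
    "120 Hz sample-and-hold": 1 / 120,
    "120 Hz + ~2 ms strobe":  0.002,   # assumed pulse width, illustrative only
}

for name, persistence_s in cases.items():
    print(f"{name}: ~{persistence_s * speed_px_per_s:.0f} px of smear")

Which is roughly why going from 60 to 120 Hz halves the smear, and strobing shrinks it further at the cost of brightness.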

As for adaptive-sync in general, I don't look at it as essential anymore. I'm just going to get a 120hz monitor and the graphics card to feed it. Will probably pick up a used 1070, or maybe Vega if it's good. Next monitor to try... I just purchased a used Samsung S23A950D off Ebay. These were glossy 120hz TN panels. I'd love to have a panel with strobing backlight, but there aren't many options. BenQ z-series could be good, but I hear they have a somewhat heavy AG coating which is a no-go. There's also the Eizo FS2735, but it's 1440p, which makes it a lot more difficult to crank out 120 fps.
 
End User
Minister of Gerbil Affairs
Posts: 2977
Joined: Fri Apr 16, 2004 6:47 pm
Location: Upper Canada

Re: G-Sync Impressions

Sun Aug 21, 2016 5:57 pm

Demetri wrote:
As for adaptive-sync in general, I don't look at it as essential anymore.

Demetri wrote:
There's also the Eizo FS2735, but it's 1440p, which makes it a lot more difficult to crank out 120 fps.

I wonder what would help with that?
 
biffzinker
Gerbil Jedi
Posts: 1998
Joined: Tue Mar 21, 2006 3:53 pm
Location: AK, USA

Re: G-Sync Impressions

Sun Aug 21, 2016 6:17 pm

End User wrote:
Demetri wrote:
As for adaptive-sync in general, I don't look at it as essential anymore.

Demetri wrote:
There's also the Eizo FS2735, but it's 1440p, which makes it a lot more difficult to crank out 120 fps.

I wonder what would help with that?

Adaptive Sync?
End User (always trying to be clever.) :P
It would take you 2,363 continuous hours or 98 days, 11 hours, and 35 minutes of gameplay to complete your Steam library.
In this time you could travel to Venus one time.
 
Demetri
Gerbil
Posts: 45
Joined: Sat Aug 23, 2014 11:48 am

Re: G-Sync Impressions

Sun Aug 21, 2016 6:20 pm

Adaptive-sync does nothing for motion blur, which is the main reason for jumping to 120hz. Also, it's incompatible with backlight strobing.
 
End User
Minister of Gerbil Affairs
Posts: 2977
Joined: Fri Apr 16, 2004 6:47 pm
Location: Upper Canada

Re: G-Sync Impressions

Sun Aug 21, 2016 6:52 pm

Demetri wrote:
Adaptive-sync does nothing for motion blur, which is the main reason for jumping to 120hz. Also, it's incompatible with backlight strobing.

I've enabled ULMB mode all of 0 times on my G-SYNC display. The elimination of screen tearing drumpfs motion blur for me when playing at 2560x1440.
 
Voldenuit
Minister of Gerbil Affairs
Posts: 2888
Joined: Sat Sep 03, 2005 11:10 pm

Re: G-Sync Impressions

Sun Sep 11, 2016 3:56 pm

End User wrote:
Demetri wrote:
Adaptive-sync does nothing for motion blur, which is the main reason for jumping to 120hz. Also, it's incompatible with backlight strobing.

I've enabled ULMB mode all of 0 times on my G-SYNC display. The elimination of screen tearing drumpfs motion blur for me when playing at 2560x1440.

I actually switched from G-Sync to ULMB for Overwatch @1440p on my 1070.
Yes, I see stutter again (even at 120 Hz, ~120 fps), but the added crispness made it easier to identify player silhouettes during quick flicks.
I still use G-Sync in a bunch of other games (primarily slower-paced single-player games); it's nice to have the option to switch between them, though.
Wind, Sand and Stars.
 
moshpit
Gerbil
Posts: 88
Joined: Wed Jan 04, 2006 1:24 am

Re: G-Sync Impressions

Fri Apr 28, 2017 12:02 pm

I have to say my impression of Gsync is one of simple amazement. I'm blown away by it. Pairing a ViewSonic XG-2703-GS monitor with a 1080 Ti, having come from a GTX 680 and an older HP 25" 60 Hz vanilla TN panel, the change is both dramatic and a little difficult to fully grasp. There is a level of smoothness to gaming now that feels almost surreal, an almost unnatural responsiveness that takes some real getting used to! It's not a negative, though, even if it is a little disconcerting at first. It's extremely easy to overcompensate on twitch reflexes in FPS situations when everything is this fluid and fast. A tiny movement of the mouse sends the view spinning, smoothly but FAST, through a complete 180-degree turn! In a game like Fallout 4 this means being able to spin completely around to face a target behind you so fast that it almost hurts the brain to keep the motion controlled. It took me about 10 minutes to master this new level of smooth. Now I don't see how I ever lived without it...

Fallout 4 is only one example of Gsync blowing my mind since building this setup. In both benchmarks and other games, the visuals are on a totally new level of experience for me. I've been a PC gamer for 25 years now, and no rendering tech since the original Voodoo Graphics has given me such a "wow!" factor. I cannot overstate how different the experience is, though I honestly have a hard time quantifying the exact effect beyond "hyper-smooth," which is the only term I can give it. Words fall short.
Ci7 7700K/32Gb Corsair DDR4 3200/Asus Z270 TUF Mark 1/PNY RTX 3090 24Gb XLR8/512Gb Samsung 960 Pro M2/Toshiba X300 6Tb/EVGA CLC 240 AIO/Corsair Carbide 400C/Corsair HX850i Platinum
 
Pville_Piper
Gerbil XP
Posts: 347
Joined: Tue Dec 25, 2012 2:36 pm
Location: Pville...

Re: G-Sync Impressions

Wed Aug 16, 2017 4:11 pm

Bit of a necro but...

I've been using a 144 Hz G-Sync monitor for over a year and I love it. In BF4 it helped me increase my KDR by about .25%. The clarity of the image was a big factor, and running at 100+ fps was really smooth.

I don't play FPS shooters as much anymore, so I would love to step up to a larger monitor to play Elite Dangerous on. It is hard for me to justify the cost, though, especially with the Rift currently running at $399.

Anyway, a video card upgrade is next, but that will have to wait. I was so hopeful that Vega would help moderate prices but I don't see that happening now.
Windows10, EVGA G2 750w Power Supply, Acer XB270H G-synch monitor, MSI Krait Gaming 3X, I7 6700K, 16 gigs of CORSAIR Vengeance LPX DDR4 3200 MHz ram, Crucial 500 gig SSD, EVGA GTX1080 FTW
