60Hz, in situations where your graphics card can't output frames at a constant 60fps: With Vsync on, your GPU finishes drawing a frame to the buffer ready for the monitor, and the monitor has to wait until its next refresh (up to 1/60th of a second = 16.6ms). On average, that means you're getting an additional 8.3ms of lag at 60Hz (since the delay will be somewhere between 0 and 16.6ms). In this example, where the GPU can't fill the buffer fast enough for the 16.6ms refresh period, you have to wait an additional 16.6ms, which means you're actually averaging 25ms of input lag (16.6 + 8.3) and frame intervals of up to 33.3ms (1/30th of a second).
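If you want to sanity-check those numbers, here's a quick Python sketch of the arithmetic (the variable names are just mine, nothing official):

```python
# Back-of-the-envelope Vsync lag at 60Hz. Assumes the wait for the
# next refresh is uniformly distributed, so it averages half an interval.
refresh_hz = 60
refresh_ms = 1000 / refresh_hz            # 16.6ms per refresh

avg_wait_ms = refresh_ms / 2              # 8.3ms average wait for the next refresh
missed_refresh_ms = refresh_ms            # full extra interval when the GPU misses one

avg_lag_ms = missed_refresh_ms + avg_wait_ms   # 16.6 + 8.3 = 25ms
longest_interval_ms = 2 * refresh_ms           # 33.3ms, i.e. an effective 30fps

print(f"average added lag: {avg_lag_ms:.1f}ms")
print(f"longest frame interval: {longest_interval_ms:.1f}ms")
```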
Now, it varies from person to person, but 1/30th of a second is too long to fool the brain into perceiving motion. It's still watchable at 30fps (movies are 24fps), but the brain struggles to fill in the gaps between separate images if there's too much motion and the difference between the two images is too great. "Fluidity" for most people is somewhere between 35Hz (29ms) and 50Hz (20ms). "Fluidity" as I'm calling it is NOT the same thing as the maximum refresh rate your eyes/brain can process - for me at least, that number is about three times greater.
120Hz, in situations where your graphics card can't output frames at a constant 60fps: Your GPU finishes drawing a frame to the buffer ready for the monitor, and the monitor has to wait until its next refresh (up to 1/120th of a second = 8.3ms). The improved refresh rate means you're averaging 4.2ms of lag from the refresh interval alone, and another 8.3ms if the GPU skips a frame because it can't keep up - bringing the average input lag down to 12.5ms (8.3 + 4.2).
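It's the same sum as before with a shorter interval; here it is generalised as a throwaway Python helper of my own (not anything from a real API):

```python
# Average added Vsync lag: half an interval of waiting, plus a full
# interval for every refresh the GPU misses.
def vsync_lag_ms(refresh_hz: float, missed_refreshes: int = 1) -> float:
    interval = 1000 / refresh_hz
    return missed_refreshes * interval + interval / 2

print(vsync_lag_ms(60))    # 25.0  (16.6 + 8.3)
print(vsync_lag_ms(120))   # 12.5  (8.3 + 4.2)
```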
Here's where it gets funky: at exactly 60fps on a 120Hz monitor, we're missing every other refresh and still delivering frames every 16.6ms, but below a constant 60fps a frame takes more than 16.6ms. In that case the GPU has up to 25ms to draw to the buffer before missing the refresh a second time, rather than the 33.3ms it has before the second refresh when running the monitor at only 60Hz. The shorter gap to the next refresh cuts the missed-refresh penalty by 8.3ms and brings the longest frame interval down to 25ms, which is important because that falls comfortably within the 35-50Hz range that most people consider fluid.
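To see why the 25ms figure drops out, here's a small sketch that works out when a finished frame actually hits the screen (assuming the frame goes out on the first refresh boundary at or after it's done):

```python
import math

# With Vsync, a finished frame is displayed on the first refresh
# boundary at or after the render completes.
def displayed_at_ms(render_ms: float, refresh_hz: float) -> float:
    interval = 1000 / refresh_hz
    return math.ceil(render_ms / interval) * interval

# A frame that takes 20ms (i.e. the GPU has dipped below 60fps):
print(f"{displayed_at_ms(20, 60):.1f}ms")    # 33.3ms - waits for the second 60Hz refresh
print(f"{displayed_at_ms(20, 120):.1f}ms")   # 25.0ms - caught by an in-between 120Hz refresh
```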
G-Sync, in situations where your graphics card can't output frames at a constant 60fps: Your GPU finishes drawing a frame to the buffer ready for the monitor, and the monitor doesn't have to wait at all. The input lag in the system is entirely down to the game engine and the rendering time, plus whatever lag the display adds processing the image. Those kinds of lag are always present, though; in the context we're talking about - lag caused by Vsync and refresh intervals - the value is zero. NO LAG. Using the mid-point 25ms value that most people would consider "fluid", the graphics card can deliver framerates as low as 40fps and still look good, still with ZERO LAG.
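The 40fps figure is just that 25ms threshold turned upside down - a trivial sketch, assuming the panel's variable-refresh range covers it:

```python
# With G-Sync the refresh waits for the frame, so (within the panel's
# variable-refresh range) the frame interval is just the render time.
fluid_threshold_ms = 25                   # my rough "still feels fluid" cutoff
min_fluid_fps = 1000 / fluid_threshold_ms
print(min_fluid_fps)                      # 40.0 - fluid down to 40fps, with no added lag
```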
G-Sync in a nutshell: Even at <60fps, G-Sync does two things: it reduces the input lag, and it increases the amount of time the GPU can spend rendering a frame before frame intervals exceed the 25ms threshold that we (I, at least) don't like. It brings the fluidity benefits of a 120Hz display to sub-60fps gaming, and it manages to reduce input lag to the same level as playing without Vsync. Playing with Vsync turned off but cranking detail levels up high is an exercise in pointlessness: why go to all the effort of making it look pretty if the monitor rips the dolled-up frame clean in two the minute there's any motion?
Nvidia have worked it out (which is annoying, because Nvidia like to keep things to themselves rather than drive open standards for the industry): we don't need ludicrous 120, 144 or even 240Hz monitors. What we need is as much detail as possible at a framerate that's fluid. Given the choice of gaming at 'ultra settings' at 50Hz (a constant 50fps) or 'medium settings' at 144Hz, I'd definitely go with the higher detail at the lower framerate. Once a game looks fluid, cranking out frames faster is wasted effort that could be better spent on making the frames prettier/more detailed.
Off topic a bit: I have actually measured the rate at which I perceive fluidity, and it's 43Hz for me. I won't bore you with the science of how that was calculated, but it's important to know that it's an average; the brain takes longer to process low-contrast and darker images, and in those instances 30Hz can seem fluid. In candyland-style, brightly contrasting images, 60Hz may not be enough. This is arguably why flicker-free CRT monitors required at least 72Hz: the strobing CRT scan was effectively a very bright, very high-contrast white band moving down the screen - the worst-case scenario of bright and contrast-y, requiring higher frequencies to fool the brain. I digress, though; I am very happy with my screen running at 85Hz. For the same reason as a 120Hz screen, it brings the dropped-frame interval from 33.3ms (at 60Hz) down to just 23.5ms - and that happens to be about 43Hz, just about enough to fool my brain. It's obviously not a constant 60fps, but it's better than yo-yoing between 60fps and 30fps every few frames.
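For completeness, the 85Hz arithmetic, with the same assumption as before (a dropped frame lands two refresh intervals after the previous one):

```python
# The 85Hz arithmetic: a dropped frame lands two refresh intervals
# after the previous one.
refresh_ms = 1000 / 85                    # 11.8ms per refresh
dropped_interval_ms = 2 * refresh_ms      # 23.5ms when a frame is missed
print(f"{1000 / dropped_interval_ms:.1f}Hz")   # 42.5Hz - close enough to my 43Hz
```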
Oh, and congratulations if you read this far. I think I've worked off my foolish alcohol-fueled insomnia, which was the main point of this post.