G-Sync Monitors?

What you see is what you get, including photography, displays, and video equipment.

Moderators: Dposcorp, SpotTheCat

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 11:36 pm

John Carmack on G-Sync and Lightboost:

https://www.youtube.com/watch?v=gbW9IwVGpX8
Intel Core i7-875K, Asus P7P55D-E Pro, Win 7 Home Premium
MSI GTX 560 Ti OC, Mushkin 2x2GB DDR3-1333, Corsair TX650
Cooler Master Hyper 212+, Logitech Z-2300, ASUS Xonar DX
Samsung Spinpoint F3 1TB, Dell Ultrasharp U2410, Antec P183
DeadOfKnight
Gerbil Elite
Gold subscriber
 
 
Posts: 634
Joined: Tue May 18, 2010 12:20 pm

Re: G-Sync Monitors?

Posted on Mon Oct 21, 2013 8:20 am

Thanks for confirming that Lightboost fixes only sample and hold blur, Auxy.

Since this is a thread about V-sync technologies, we don't need yet another Auxy-vs-The-World Lightboost crusade; there are already dozens of those threads scattered all over the forums.

Yeah, Lightboost is good, but it's also irrelevant and off-topic here.
<insert large, flashing, epileptic-fit-inducing signature (based on the latest internet-meme) here>
Chrispy_
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1462
Joined: Fri Apr 09, 2004 2:49 pm

Re: G-Sync Monitors?

Posted on Mon Oct 21, 2013 9:02 am

Chrispy_ wrote:Thanks for confirming that Lightboost fixes only sample and hold blur, Auxy.

Since this is a thread about V-sync technologies, we don't need yet another Auxy-vs-The-World Lightboost crusade; there are already dozens of those threads scattered all over the forums.

Yeah, Lightboost is good, but it's also irrelevant and off-topic here.

You're a jerk. I mentioned it because if G-Sync and Lightboost are incompatible, then I think G-Sync is a non-starter for hardcore gamers, who are the only ones likely to pick up the niche hardware required at this time. So I'm saying it will have to become more widely available to have any chance of success.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Posted on Mon Oct 21, 2013 10:01 am

auxy wrote:
Chrispy_ wrote:Thanks for confirming that Lightboost fixes only sample and hold blur, Auxy.

Since this is a thread about V-sync technologies, we don't need yet another Auxy-vs-The-World Lightboost crusade; there are already dozens of those threads scattered all over the forums.

Yeah, Lightboost is good, but it's also irrelevant and off-topic here.

You're a jerk. I mentioned it because if G-Sync and Lightboost are incompatible, then I think G-Sync is a non-starter for hardcore gamers, who are the only ones likely to pick up the niche hardware required at this time. So I'm saying it will have to become more widely available to have any chance of success.

I'm going to have to play devil's advocate here and say that I'd much rather deal with a bit of blur than relentless tearing. Of course, I could just use adaptive V-Sync and Lightboost to address both, but I think I'm in a minority when I say I'd also rather deal with occasional stutter and lag than tearing. Therefore, this may very well have broader appeal than Lightboost currently does. These may all become compatible in the future, though. Besides, to my knowledge the current implementation of Lightboost isn't really a feature in the way you suggest without some exploitative tweaks that users may or may not be comfortable with. Backlight strobing doesn't have mainstream support yet.

The whole point is moot, though, if these technologies don't make their way to IPS panels, because beautiful colors, high resolutions, and great viewing angles beat the crap out of responsiveness for all but competitive multiplayer gaming and some of the new VR stuff, IMO. The only reason I'm even considering this is that I want a second monitor, and it does make sense to get one with a set of strengths for usage scenarios that aren't already covered by my current display.
DeadOfKnight
Gerbil Elite
Gold subscriber
 
 
Posts: 634
Joined: Tue May 18, 2010 12:20 pm

Re: G-Sync Monitors?

Posted on Mon Oct 21, 2013 10:49 am

auxy wrote:You're a jerk.


Why do you always do this? There's no need to be so aggressive in a discussion, and there's no need to force everyone to think exactly the way you do.
I'm just trying to keep this thread on topic, which is V-sync technology. Maybe I called this one too early, but there are enough threads discussing lag/latency/120Hz/TN-vs-IPS where it turns into an argument between you and everyone else in the thread.
Chrispy_
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1462
Joined: Fri Apr 09, 2004 2:49 pm

Re: G-Sync Monitors?

Posted on Mon Oct 21, 2013 11:29 am

Chrispy_ wrote:
auxy wrote:You're a jerk.


Why do you always do this? There's no need to be so aggressive in a discussion, and there's no need to force everyone to think exactly the way you do.
I'm just trying to keep this thread on topic, which is V-sync technology. Maybe I called this one too early, but there are enough threads discussing lag/latency/120Hz/TN-vs-IPS where it turns into an argument between you and everyone else in the thread.

I'd appreciate it if you both kept these personal jabs to yourselves. Nobody likes having their thread locked because other people can't behave themselves.
DeadOfKnight
Gerbil Elite
Gold subscriber
 
 
Posts: 634
Joined: Tue May 18, 2010 12:20 pm

Re: G-Sync Monitors?

Posted on Mon Oct 21, 2013 11:39 am

DeadOfKnight wrote:I'd appreciate it if you both kept these personal jabs to yourselves. Nobody likes having their thread locked because other people can't behave themselves.


Indeed.

Let it go folks.
"Welcome back my friends to the show that never ends. We're so glad you could attend. Come inside! Come inside!"
Ryu Connor
Global Moderator
Gold subscriber
 
 
Posts: 3450
Joined: Thu Dec 27, 2001 6:00 pm
Location: Marietta, GA

Re: G-Sync Monitors?

Posted on Thu Oct 24, 2013 11:38 am

Since 2D strobing is an officially nVidia-sanctioned optional feature of G-SYNC monitors:
Chrispy_ wrote:Even a "slow" 6ms panel exhibits less visible ghosting than the 1-frame blur of sample-and-hold.
Just to clarify: both are cumulative.
(I assume you meant it that way, but this can easily be misunderstood, so it needs further expansion.)

Hold/persistence and pixel transitions overlap each other to accumulate motion blur: GtG adds blur above and beyond sample-and-hold. So that 6ms of transition is actually adding blurring/ghosting on top of the flicker-free frame length (e.g. 16.7ms), i.e. the persistence. A lot of people use ghosting and motion blur interchangeably, and they are visually related (ghosting is simply asymmetric motion blur, blurrier on one edge than the opposite edge, often because pixel transitions were slower in one direction than the other), but the key is that either one degrades motion clarity...

Persistence (visible time) and GtG transitions (pixels changing state) are two somewhat different things, though they certainly blend/overlap into each other, especially on older LCDs that end up in perpetual transition without much stable 'hold'. Either way, it doesn't matter; either one creates blurring and/or ghosting. So both scenarios are a problem:
-- Eliminating stationary pixel color (sample-and-hold) by keeping pixels in perpetual transition still creates persistence. (The transitions add to persistence.)
-- Keeping stationary pixel color (sample-and-hold) by making pixel transitions instantaneous still means persistence. (Yes, less so, as you say, since transitions are now shorter than a refresh cycle, but the whole refresh cycle is persistence.)

Both approaches are high-persistence, since lengthening pixel transitions (to prevent static hold) never, ever shortens persistence, and never reduces motion blur. All flicker-free displays at today's typical refresh rates (60Hz, 120Hz) produce motion blur, no matter how fast the GtG transitions are. Mathematically, GtG blends with persistence.

You can never have 6ms of visible GtG and only 1ms of visible persistence, since visible GtG adds to visible persistence.
As a result, all flicker-free 60Hz LCDs with a PWM-free, steady-state backlight are guaranteed to have at least ~16.7ms of persistence (1/60sec), no matter how slow or instant the pixel transitions are.

To achieve 1ms of persistence (to enter CRT motion-quality league) at today's refresh rates, you need to modulate light (CRT, strobe backlight, strobe OLED, etc.) by illuminating each frame for just 1ms. You could slowly modulate/decay the strobes instead of using a square wave, to soften the effect, but that lengthens persistence and thus adds more motion blur/ghosting. At least until we're able to achieve 1ms of persistence with zero light modulation (aka 1000fps@1000Hz -- each frame 1ms). Real life doesn't strobe; real life does not operate on frames. But anything attempting to resemble that (e.g. >1000fps@1000Hz) is not going to happen soon, so that makes light modulation necessary (strobing, phosphor decay, etc.). Motion tests (e.g. testufo.com, PixPerAn, etc.) show this, and science papers already cover it; use Google Scholar to search for "MPRT" and "LCD motion blur" sometime. A fun DIY is to run the same motion test at the same motion speed across different refresh rates and strobe on/off modes.
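The persistence arithmetic above fits in a quick back-of-envelope sketch (my own illustrative helper, not from any real library; the 960 px/s speed is just an example figure): for an eye-tracked object on a sample-and-hold display, blur width in pixels is roughly persistence times motion speed.

```python
def blur_width_px(persistence_ms, speed_px_per_s):
    """Approximate eye-tracked motion blur width: persistence * speed."""
    return persistence_ms / 1000.0 * speed_px_per_s

# Flicker-free 60Hz hold: ~16.7ms persistence -> ~16 px of blur at 960 px/s.
hold_blur = blur_width_px(1000.0 / 60.0, 960)
# 1ms strobe (CRT-league persistence): under 1 px of blur at the same speed.
strobe_blur = blur_width_px(1.0, 960)
```

This is why GtG alone can't save a sample-and-hold display: even with instant transitions, the ~16.7ms hold term dominates the blur width.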

I'd prefer we didn't strobe, but until 1000fps@1000Hz makes 1ms of persistence possible without light modulation of any sort, we'll have to live with strobing as a band-aid, or go back to our old vacuum-filled glass balloons (CRTs) that still produce a great picture. The great thing is that strobing is a selectable choice in new LCDs. Mike of Valve Software also covered this topic (valvesoftware.com link), likewise mentioning 1000Hz.

nVidia just revealed that strobing will be an officially sanctioned feature (blurbusters link) in G-SYNC, unlike the unofficial LightBoost. It will be easily enabled/disabled in the nVidia menus, so G-SYNC monitors include a sequel to LightBoost. It's a fixed-rate strobe mode at the moment and doesn't vary, so you can only choose strobed (fixed refresh rate) or non-strobed (variable refresh rate).

So it'll be official and sanctioned; and it's a choice. Easy ON/OFF in nVidia menus. PWM-free mode. Strobe mode. Everyone's happy.
Last edited by mdrejhon on Thu Oct 24, 2013 12:52 pm, edited 6 times in total.
Thanks
Mark Rejhon
mdrejhon
Gerbil
 
Posts: 49
Joined: Tue Feb 05, 2013 10:50 am

Re: G-Sync Monitors?

Posted on Thu Oct 24, 2013 12:35 pm

Moving on to a less controversial subtopic, as this is a wide benefit:

Another interesting thing about G-SYNC is that it decouples frame delivery time from frame refresh length.
So it can still benefit fixed-refresh-rate operation. It is now possible to have 60fps@60Hz but with only 1/144sec of frame delivery time (and 1/144sec scanout) from the computer to the monitor. Traditionally, 60Hz refreshes needed 1/60sec to deliver a frame from the computer to the monitor (top-to-bottom scan). This could be great for reducing input lag.
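A quick sketch of the delivery arithmetic above (the 60Hz and 144Hz figures are the ones from the post; the helper name is mine):

```python
def scanout_ms(link_hz):
    """Time to deliver one full frame top-to-bottom at the given link rate."""
    return 1000.0 / link_hz

traditional = scanout_ms(60)   # ~16.67 ms to scan out a frame at fixed 60Hz
gsync = scanout_ms(144)        # ~6.94 ms delivering the same frame at max link speed
saved = traditional - gsync    # ~9.7 ms less delivery latency per frame
```

So even locked at 60fps, a G-SYNC panel driven at its maximum scanout rate shaves roughly 10ms off each frame's delivery time.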

Software-controlled fixed refresh rate:
- Play movies 24fps@48Hz
- Play videos 30fps@30Hz
- Play movies 48fps@48Hz
- Play television 60fps@60Hz
- Variable frame rate video files
- Future movie formats (e.g. 72fps, 96fps)
- Play mixed TV material and dynamically detect 24p, 30p, and 60p material, dynamically rate-adapt to new fixed rate (with zero mode-change flicker)
- Play emulators 60fps@60Hz with lower input lag (taking advantage of 1/144sec frame delivery times)
- etc.

The software controls the timing of the frames now. It doesn't have to be always variable; it can even choose a fixed rate of frame delivery rather than variable delivery to the monitor. Apparently there are more useful practical applications than I thought, even outside of games.
mdrejhon
Gerbil
 
Posts: 49
Joined: Tue Feb 05, 2013 10:50 am

Re: G-Sync Monitors?

Posted on Sat Oct 26, 2013 4:58 am

Yeah, agreed that "blur" as seen by most people is a combination of sample-and-hold persistence AND pixel response delay, but in the interests of simplification, especially with "1ms" gaming screens, pixel response is insignificant compared to the 16.7ms or 8.3ms of blur caused by sample and hold. It's actually more accurate to always describe the blur or ghosting on non-strobing screens as "one unwanted frame", no matter what refresh rate and pixel response your screen has. Whilst strobing effectively fixes blur and ghosting, it has nothing to do with fluidity of motion, only the perceived sharpness of moving objects your eye follows across the screen.

G-Sync really interests me, because it solves two far more significant problems (to me, at least):

1) Tearing
Tearing has no place in modern games with
  • high-def art assets
  • Anti-aliasing to remove even a single pixel of jagged edge
  • Aniso filtering to remove even the subpixel pattern of shimmering our brains can notice in a moving texture
To spend all that processing effort on subpixel precision and perfection, and then put at least one, perhaps several, giant rips in the image seems beyond ridiculous...
In the case of twitch gaming, with only a couple of refreshes to represent a 180-degree camera flick, a well-timed screenshot is all you need to show the true horror of what tearing can do! (You'll have to excuse the low quality settings; I needed a couple of hundred FPS to get that triple tear.)

2) Fluidity of motion
That was the point of my original (if somewhat inebriated) post in this thread, pre-Lightboost derail: fluidity is the single most important goal of a display.
The whole point of cinema and television is to create the illusion of motion from static images; the more believable the illusion, the better the display.
Chrispy_
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1462
Joined: Fri Apr 09, 2004 2:49 pm

Re: G-Sync Monitors?

Posted on Sat Oct 26, 2013 12:04 pm

Chrispy_, excellent -- you speak the same technical language!
Although Blur Busters focuses on the simple stuff, trying to educate the masses, I could write far more advanced articles; it's just so difficult to explain to people who think this is all display voodoo... :wink:

Chrispy_ wrote:Yeah, agreed that "blur" as seen by most people is a combination of sample-and-hold persistence AND pixel response delay, but in the interests of simplification, especially with "1ms" gaming screens, pixel response is insignificant compared to the 16.7ms or 8.3ms of blur caused by sample and hold. It's actually more accurate to always describe the blur or ghosting on non-strobing screens as "one unwanted frame", no matter what refresh rate and pixel response your screen has. Whilst strobing effectively fixes blur and ghosting, it has nothing to do with fluidity of motion, only the perceived sharpness of moving objects your eye follows across the screen.
Yes, that's one way of saying it. It needs further expansion, however:

1. Fluidity of motion without eye tracking.
-- Correct -- strobing doesn't help -- you get the stepping effects of discrete framerates: wagon-wheel effects, mouse-droppings effects (move your mouse in circles). It's not smoother with strobing; strobing doesn't fix stroboscopic effects. In games, staring stationary at the crosshairs while stuff passes in front of you, you may see stroboscopic effects of the finite framerate. You might even see the individual frames better and more clearly (a clearer stroboscopic effect), but smoothness is not improved.

2. Fluidity of motion with eye tracking.
-- In certain situations, strobing contributes to an improved feeling of smoothness: perceived fluidity of motion during VSYNC ON at framerate=Hz (120fps@120Hz) when eye tracking is accurate (e.g. saccade errors of less than ~1 pixel width). Example: http://www.testufo.com/photo on a LightBoost display looks like perfect smooth motion (if using a browser with working VSYNC) -- much smoother than non-LightBoost 120fps@120Hz (because motion blurring tricks the brain into thinking it's less smooth, or from PWM artifacts from multiple strobes per refresh).

Certainly, smoothness can become worse with strobing and tearing, since strobing can amplify the visibility of microstutters and tearing (without motion blur to hide them). But with perfect framerate=Hz (VSYNC ON), there are no microstutters, and perceived smoothness can improve with strobing, as long as your mouse isn't the limiting factor. Older Source engine games, for example, are now generally capable of perfect framerate=Hz VSYNC ON on modern GPUs; running Source engine games with VSYNC ON, on a GTX 680 or faster, currently gives improved smoothness with LightBoost enabled. For this reason, I almost always play those games with LightBoost and VSYNC ON, with a good 1000Hz mouse, and make sure I sustain 100fps@100Hz or 120fps@120Hz to minimize the input lag penalty of VSYNC ON as much as possible.

G-SYNC eliminates the need to keep a perfect fixed framerate=Hz, so you get sustained smoothness whenever the refresh rate varies.

On a related subtopic -- some people track with their eyes during 180-degree flicks, while others don't. Some have sufficiently fast eye tracking to keep up with objects during a 180-degree flick in first-person shooters. That helps them identify camouflaged enemies in bushes more quickly without motion blur, and stop mid-turn during a flick (or even slow slightly mid-flick, shoot, then finish the flick -- essentially completing a 180-degree flick while shooting an enemy 90 degrees off axis halfway through; a select few gamers have that ability), suddenly spotting enemies that were 90 degrees to the side when they only intended to reverse direction. Others religiously stare at the crosshairs, without much eye tracking, in such situations. So the benefit of a strobe-backlight display for a competitive gamer varies from gamer to gamer -- it won't help everyone. Pro gamers used to staring at crosshairs (e.g. strafing and shooting things as they pass the crosshairs) generally don't benefit from strobe backlights, while for gamers used to eye tracking (or games that force eye tracking), the benefits really show up (as we already know from testimonials). And strobe backlights carry the minor input lag penalty of waiting for a refresh to finish in darkness before strobing on the completed LCD refresh (see high-speed video), but as we now know, they can improve reaction times in certain use cases (games that force eye-tracking tactics) by more than enough to compensate (e.g. testimonials of improved scores during strobing).

The ultimate of the ultimate: combining strobing with G-SYNC simultaneously, gaining the advantages of both. It's a very difficult engineering feat, due to the flicker risk of variable-rate strobe backlights, but probably a far simpler engineering fix than low persistence without light modulation (which requires ultra-high update rates like 1000Hz if you don't want strobing/phosphor decay/subfields/temporal modulation of pixels). I've sketched a diagram of a variable-rate strobing algorithm that doesn't create low-frequency flicker. Just to be clear, this algorithm is being given away for free (I'm only asking for credit if anyone uses it). Although it does nothing for smoothness in non-eye-tracking situations, fortunately that's not mutually exclusive: it doesn't make combining G-SYNC with strobing impossible. Wouldn't it be nice to get everything -- smooth variable frame rates, zero motion blur during eye tracking, zero tearing, no stutters -- all at once!

[Image: quick-and-dirty napkin-sketch diagram of variable-rate strobing without flicker -- an 'adaptive LightBoost' algorithm.]
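Since the actual algorithm isn't spelled out here, the following is only my own loose reconstruction of the idea such a scheduler might use (the 85Hz threshold and all names are assumptions, echoing the flicker threshold discussed later in the thread): strobe once per delivered frame while the frame rate stays above a flicker floor, and fall back to a steady backlight when frames arrive too slowly.

```python
FLICKER_FLOOR_HZ = 85.0  # assumed comfort threshold, not from the diagram

def backlight_mode(frame_interval_ms):
    """Choose a backlight behaviour for the current frame interval.

    'strobe' = one short pulse per frame (low persistence);
    'steady' = constant backlight (no visible flicker at low rates).
    """
    rate_hz = 1000.0 / frame_interval_ms
    if rate_hz >= FLICKER_FLOOR_HZ:
        return "strobe"
    return "steady"

# 120fps frames arrive every ~8.3 ms -> strobe;
# 48fps frames arrive every ~20.8 ms -> steady backlight.
```

A real implementation would also have to handle the transition between modes gracefully (e.g. ramping pulse width) to avoid visible brightness steps; this sketch only shows the mode decision.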

Definitely, you're right -- strobe backlights mainly benefit eye tracking. However, lots of LightBoost users (including myself) would love to have both G-SYNC *and* strobing simultaneously. :D Now that G-SYNC monitors include an optional, NVIDIA-sanctioned strobing feature, it's not too much of a derail to discuss merging/combining the two (an engineering challenge).
mdrejhon
Gerbil
 
Posts: 49
Joined: Tue Feb 05, 2013 10:50 am

Re: G-Sync Monitors?

Posted on Wed Oct 30, 2013 1:37 pm

I'm thinking that "flicker-free" in the old CRT days meant about 85Hz. Some people claimed 60Hz was fine, others complained of flickering and headaches at 72Hz, but I doubt anyone really had a problem with 85Hz.

A good option for G-Sync with strobing would be to have a constant, traditional backlight for refresh rates under 85Hz, and to strobe only at 85Hz or higher. It would be nice if strobing could be effective at lower frequencies, but I don't think flicker is ever desirable, and I think I'm correct in saying that sample-and-hold blur is less noticeable at lower framerates anyway. For an object moving across the screen at any speed (let's pick an arbitrary 1200 pixels/second), the lower framerate draws the object further from its position in the previous refresh, meaning your brain is less likely to interpret the two separate images as one blurred one. At 30Hz the object has moved 40 pixels each update, but at 120Hz the object has only moved 10 pixels. As I understand it, perception of sample-and-hold blur is very dependent on the ratio of object size to blur length, which is why small squares in motion-blur tests exhibit blur less obviously than larger ones.
Chrispy_
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1462
Joined: Fri Apr 09, 2004 2:49 pm

Re: G-Sync Monitors?

Posted on Wed Oct 30, 2013 1:58 pm

Chrispy_ wrote:A good option for G-Sync with strobing would be to have a constant, traditional backlight for refresh rates under 85Hz, and to strobe only from 85Hz or higher. It would be nice if strobing could be effective at lower frequencies but I don't think flicker is ever desirable
Yep, that's what my diagram shows.

In other news, Eizo just announced another strobe-backlight monitor, the FG2421:
Techreport thread on Eizo Foris FG2421

Either way, it is great that more and more vendors are recognizing the motion blur benefits of strobing, at least as optional useful modes.
mdrejhon
Gerbil
 
Posts: 49
Joined: Tue Feb 05, 2013 10:50 am

Re: G-Sync Monitors?

Posted on Wed Oct 30, 2013 3:18 pm

"Games implement motion blur because they refuse to run at 60hz on consoles, instead preferring to shoot for 25-30FPS to make for nicer screenshots. 25-30FPS looks awful, so they use motion blur to try and hide the effect. It still looks ridiculous and unrealistic, because it IS unrealistic. Your eyes blur moving images when they aren't focused on them. Artificially blurring an image which you ARE focused on is distracting and counterproductive. All pro-gamers turn off motion blur. The image on screen should stay clear and steady as it does in real life, and Lightboost allows this to happen. "

I'm sorry to say this, but both assumptions are wrong, even though you are correct that games overdo it.

The human eye/brain cannot distinguish discrete changes at high time resolution. Human persistence of vision is said to be ~1/25 of a second.
It's easy to verify: hold up your hand, focus on it, and shake it as if waving goodbye. Focus on the tips of your fingers: motion blur on focused objects.

Focus: your eye also needs to focus, and this causes near or far objects, even static ones, to become... wait for it... out of focus.

A moving object in focus gets blurred.
A static object out of focus gets blurred.

That's just how we work as humans, bound by the physics of light and optics.

Now, video games try to simulate many things but take shortcuts, like lighting and shadows; both are clearly wrong and don't look real.
But it's worse to have a game with no shadows at all than with fake shadows.

The biggest issue with games (and movies) is that neither medium can match what your eye is doing with regard to focus. This becomes much worse with stereo 3D.

So what you describe is a problem with games exaggerating motion blur and not implementing real-world optics correctly.
A game should really render like a camera: if your refresh interval is 1/60s, your exposure time should be 1/60s, where ALL the light over that 1/60 of a second is captured (following the persistence-of-vision rule).
So if an object moves across the screen in 1/60 of a second, it should 'smear' across the screen, like it would in real life.

Game rendering uses a 1/infinity exposure, so at 1/30 you get stuttering animation. It's as if an object dematerializes at one location and reappears at another, never having existed between the two locations.
The higher the frame rate (the number of frames actually rendered at discrete time steps), the less 'temporal aliasing' you get.
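The camera-style 1/refresh exposure described above is commonly approximated by temporal supersampling: render several evenly spaced sub-frames inside the exposure window and average them. A minimal sketch (the toy scene and all names are mine, not from any engine; a real renderer would average full rendered images, not positions):

```python
def accumulate_exposure(render, t0, exposure_s, subframes=8):
    """Average `subframes` samples taken across [t0, t0 + exposure_s)."""
    dt = exposure_s / subframes
    samples = [render(t0 + i * dt) for i in range(subframes)]
    return sum(samples) / len(samples)

# Toy "scene": horizontal position of an object moving at 1200 px/s.
# Sampling across a 1/60s window smears the motion over the exposure,
# instead of freezing the object at t0 (the 1/'infinity' exposure).
blurred = accumulate_exposure(lambda t: 1200.0 * t, 0.0, 1.0 / 60.0)
```

The cost is obvious: eight sub-frames means roughly eight times the rendering work per displayed frame, which is exactly why games reach for cheaper hacks instead.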

Solving this correctly is computationally costly, so most people use hacks...
And focus/DoF can only be used sparingly, as the content needs to, like a magician, make us focus where the game wants; otherwise it all looks broken.
sschaem
Gerbil Team Leader
 
Posts: 227
Joined: Tue Oct 02, 2007 10:05 am

Re: G-Sync Monitors?

Posted on Wed Oct 30, 2013 3:47 pm

mdrejhon wrote:
A good option for G-Sync with strobing would be to have a constant, traditional backlight for refresh rates under 85Hz, and to strobe only from 85Hz or higher. It would be nice if strobing could be effective at lower frequencies but I don't think flicker is ever desirable
Yep, that's what my diagram shows.

In other news, Eizo just announced another strobe-backlight monitor, the FG2421:
Techreport thread on Eizo Foris FG2421

Either way, it is great that more and more vendors are recognizing the motion blur benefits of strobing, at least as optional useful modes.


VA, 1080p, 120Hz over DVI/DP, a focus on low input lag, control over USB -- the only thing that strikes me as odd is the frame-smoothing feature, which is sure to add input lag given that it needs two whole rendered frames to generate the extra frame for 240Hz output. That seems counter to their claimed focus on lowering input lag.
Canon 6D||[24-105/4L IS USM|100/2.8L Macro IS USM|70-300/4-5.6 IS USM|40/2.8 STM|50/1.4 USM|85/1.8 USM|Samyang/Bower 14/2.8 Full-Manual Rectilinear Wide-angle|
Canon EOS-M|11-22/4-5.6 IS STM|22/2 STM|EF-M 18-55/3.5-5.6 IS STM|
For sale!|24/2.8 IS USM
|
Airmantharp
Maximum Gerbil
 
Posts: 4695
Joined: Fri Oct 15, 2004 9:41 pm

Re: G-Sync Monitors?

Posted on Wed Oct 30, 2013 3:56 pm

Review for the Eizo here.

I'd really like to see just what can be done with a colorimeter, considering the measured 5000:1 contrast ratio would be incredible for gaming, and I'd love to see what the traditional monitor review houses have to say about the actual input lag and pixel response -- 1ms is not likely with a VA panel.
Airmantharp
Maximum Gerbil
 
Posts: 4695
Joined: Fri Oct 15, 2004 9:41 pm

Re: G-Sync Monitors?

Posted on Wed Oct 30, 2013 4:16 pm

Airmantharp wrote:Review

:lol: That's not a review - it's a garbage advertising post. Just like everything that this "expert" site contains :wink:
My subscription allows you people to exist on this site and makes me a better human being than you'll ever be
JohnC
Gerbil Jedi
Gold subscriber
 
 
Posts: 1741
Joined: Fri Jan 28, 2011 1:08 pm
Location: NY/NJ/FL

Re: G-Sync Monitors?

Posted on Wed Oct 30, 2013 4:29 pm

JohnC wrote:
Airmantharp wrote:Review

:lol: That's not a review - it's a garbage advertising post. Just like everything that this "expert" site contains :wink:


Well, it's also all there is -- and I didn't see anything 'bad' on the face of it, just nothing good either :D.
Airmantharp
Maximum Gerbil
 
Posts: 4695
Joined: Fri Oct 15, 2004 9:41 pm

Re: G-Sync Monitors?

Posted on Wed Oct 30, 2013 4:54 pm

sschaem wrote:I'm sorry to say this, but both assumptions are wrong, even though you are correct that games overdo it.

The human eye/brain cannot distinguish discrete changes at high time resolution. Human persistence of vision is said to be ~1/25 of a second.
It's easy to verify: hold up your hand, focus on it, and shake it as if waving goodbye. Focus on the tips of your fingers: motion blur on focused objects.
I obviously know this; see my later post --
auxy wrote:More to the point that I was making, though, objects under your crosshair -- objects your character is clearly immediately focusing on -- also get blurred. This is unrealistic. Things in motion only get blurry in your vision when your eyes can no longer track them. As long as your eye can track it -- and your eye, or at least my eyes, can track pretty damn fast -- and you stay focused on it, it should stay clear.
Read the whole thread. :)

Besides, making claims like "the human persistence of vision" or "human ability" is nonsense -- the dynamic range of human ability is vast and varied, and talking about "what humans can do" is fairly pointless.
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Posted on Wed Oct 30, 2013 6:18 pm

auxy wrote:More to the point that I was making, though, objects under your crosshair -- objects your character is clearly immediately focusing on -- also get blurred. This is unrealistic. Things in motion only get blurry in your vision when your eyes can no longer track them.


I think what you are saying is completely wrong, and it's skewing this thread away from the core of what G-Sync (with or without strobing) brings to gaming. I don't think you fully understand how human vision works (neither do I, but as a colourblind scientist I have spent years reading about human vision -- it intrigues me both in the way it works and in what I'm missing out on).

You explicitly go to the effort (in writing) to equate "objects under your crosshair" with "objects your character is immediately focusing on" and then say that blurring those objects is unrealistic.

Sorry to break it to you, but the crosshair is static; if something is moving relative to the crosshair, your character is not tracking it, and it should be blurred.

I think you misunderstand the way the eye blurs movement both on-screen and in the real world, but that's okay - here's a really simple test you can do to help you understand better:
  1. Get a sharpie and draw a dot/line/shape on your right thumbnail. As long as it's well-defined and you can focus on it, that's all that matters.
  2. Now stick up both thumbs together at arm's length and focus on the left thumbnail.
  3. Keeping both thumbs at arm's length (to keep the focal distance the same), wave your right thumb around.
The shape you drew on your right thumbnail is a blurry mess until you switch focus to your moving right thumb and track it. At that point your immediate focus (the crosshair) is tracking the moving thumb, and since the shape on your thumb is stationary relative to your immediate focus, it is sharp and well defined again. You could say it's "in focus" again! This blurring of things that move relative to your focal point is a basic demonstration of retinal persistence, and unless you have completely different brain/eye chemistry to every human eye ever studied, this is how your vision should work too. If you're curious, start Googling retinal chemistry; your retina has "lag" too, but your brain is awesome at post-processing the lag away.

What doesn't help is that I think you are confusing your game character's focal point (the crosshair) with your own focal point when you observe motion on screen. What strobing does is help your eyes perceive motion across the screen more clearly as you track it. However, since the screen presents the focal direction of your in-game character, the very act of tracking an object moving across the screen means that you're not looking at the same thing as your character. What your character is looking at is in the middle of the screen, and it will STAY there. If your character looks at something else, the "something else" will still be in the middle of the screen.

The only thing I can think of that breaks that rule is a screen shake during an explosion shockwave or similar, but I'd kind of expect my vision to be a little messed up *during* an explosion, so I guess it's an acceptable representation. I don't really want to subject myself to a real explosion to find out whether that's accurate or not ;)
<insert large, flashing, epileptic-fit-inducing signature (based on the latest internet-meme) here>
Chrispy_
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1462
Joined: Fri Apr 09, 2004 2:49 pm

Re: G-Sync Monitors?

Postposted on Wed Oct 30, 2013 6:29 pm

Chrispy_ is correct -- strobing only helps eye tracking.

auxy seems correct in one situation -- only IF you're watching the moving object, waiting for it to go under the crosshairs. Auxy did say "objects your character is clearly immediately focusing on", which suggests exactly the tracking use case Chrispy is describing. Yes, Chrispy is correct that if you're staring at the stationary crosshairs while waiting for the object to move underneath, strobing doesn't help. However, auxy may have meant she's eye-tracking the moving object while waiting for it to pass the crosshairs. In that case, auxy is correct too.

So basically:
(A) Your eyes are tracking THE moving object while waiting for it to pass crosshairs -- correct, strobing helps
(B) Your eyes are staring at THE stationary crosshairs while waiting for moving object to pass underneath -- correct, strobing doesn't help.
Different players play differently with their eyeballs...

If viewed that way, what auxy and Chrispy are saying isn't mutually exclusive (if I'm interpreting you both correctly).
Last edited by mdrejhon on Wed Oct 30, 2013 7:10 pm, edited 1 time in total.
Thanks
Mark Rejhon
mdrejhon
Gerbil
 
Posts: 49
Joined: Tue Feb 05, 2013 10:50 am

Re: G-Sync Monitors?

Postposted on Wed Oct 30, 2013 6:45 pm

Yes, Mark, you are correct.

Chrispy_, I called you a jerk earlier because you're being a jerk to me. You're not even trying to understand what I'm saying -- you're not looking at my posts and going "well, what does she mean? how can I understand these statements to mesh with my understanding?" which would be giving me the benefit of the doubt and the polite way to discuss something.

Instead, you're reading my post and looking for ways to read me that come out with me being wrong. That's just rude. Not only that, but you're trying to paint me as taking the thread off-topic, when all of this argumentativeness is based on one offhand comment that I made about the market viability of G-Sync among hardcore gamers.

That's really rude, and I don't know what your problem with me is, but you really should lay off.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Postposted on Wed Oct 30, 2013 7:03 pm

auxy/Chrispy .... Now that we know what the misunderstanding was (both of you were apparently correct about the crosshairs, but for different reasons), and understand why that was the case, let's pretend it never happened. I don't want this pretty interesting thread locked.

________________________________________________

Moving on....
Anyway, back more-or-less on topic (considering G-SYNC monitors also include a strobe mode, and asynchronous refreshes), regarding strobe backlights:
I found out why Eizo does a two-pass refresh.

1. First pass refresh is overdriven, but in total darkness. (erases previous refresh)
2. Second pass refresh is clean, but still in total darkness. (erases overdriven refresh)
3. Strobe backlight flashes at end of second pass refresh. (clean refresh seen by eyes)
(Citation: page 15 of the Eizo FDF2405W monitor documentation; it uses the same technique as the Eizo FG2421.)

So you have VA without major ghosting/overdrive artifacts.
Basically, the 240Hz is simply there to fix VA artifacts and to provide a very clean-looking 120Hz worth of visible strobing.
So even while Eizo is doing this black-frame-insertion equivalent, it's also simultaneously doing something special with the LCD panel in total darkness via the two-pass refresh, to clean up the refresh and reduce/eliminate VA artifacts. So that's another additional purpose of strobing -- hiding the overdrive artifacts.
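The three-step sequence above is easy to lay out as a timeline. This is a hedged sketch: the 240Hz internal / 120Hz visible figures come from the posts, but the event labels and the `frame_timeline` helper are my own illustration, not Eizo's actual firmware behaviour.

```python
# Illustrative timing sketch of the two-pass strobed refresh described above.
# The 240 Hz / 120 Hz figures come from the post; the event labels and the
# helper are illustrative, not Eizo's actual firmware behaviour.

INTERNAL_HZ = 240   # panel does two refresh passes per visible frame
VISIBLE_HZ = 120    # strobe flashes actually seen by the eye

internal_period_ms = 1000 / INTERNAL_HZ   # ~4.17 ms per refresh pass
visible_period_ms = 1000 / VISIBLE_HZ     # ~8.33 ms per strobe flash

def frame_timeline(t0_ms):
    """Events for one visible frame starting at t0_ms: two refresh passes
    in total darkness, then the backlight strobe that shows the clean pass."""
    return [
        (t0_ms,                          "pass 1: overdriven refresh, backlight off"),
        (t0_ms + internal_period_ms,     "pass 2: clean refresh, backlight off"),
        (t0_ms + 2 * internal_period_ms, "strobe: backlight flash (clean frame seen)"),
    ]

# Two internal passes fit exactly inside one visible strobe period.
assert abs(2 * internal_period_ms - visible_period_ms) < 1e-9

for t, label in frame_timeline(0.0):
    print(f"{t:6.2f} ms  {label}")
```

The point of the sketch is just that the eye only ever sees the result of the second (clean) pass, because both LCD transitions happen with the backlight off.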
Thanks
Mark Rejhon
mdrejhon
Gerbil
 
Posts: 49
Joined: Tue Feb 05, 2013 10:50 am

Re: G-Sync Monitors?

Postposted on Wed Oct 30, 2013 7:36 pm

That EIZO is looking a whole lot more attractive now:

You're half a cycle behind a standard 120Hz screen, but that's not such a big deal because 1.5 cycles of 120Hz (12.5ms) is still less than a single cycle at 60Hz, and 1 frame of input+response lag (16.7ms) on a 60Hz panel is generally accepted as lag-free.
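For concreteness, the back-of-envelope arithmetic behind that comparison is just 1000ms divided by the refresh rate; nothing monitor-specific:

```python
# Back-of-envelope lag comparison: cycle time is 1000 ms / refresh rate.

def cycle_ms(hz):
    return 1000 / hz

lag_strobed_120 = 1.5 * cycle_ms(120)   # half a cycle behind a plain 120 Hz screen
lag_60_one_frame = 1.0 * cycle_ms(60)   # one frame of lag on a 60 Hz panel

print(f"1.5 cycles @ 120 Hz = {lag_strobed_120:.1f} ms")   # 12.5 ms
print(f"1.0 cycle  @ 60 Hz  = {lag_60_one_frame:.1f} ms")  # 16.7 ms
assert lag_strobed_120 < lag_60_one_frame
```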

Auxy. I really am not trying to make this a personal attack (and you should try not to read everything as if it *is* that way):

Sschaem explained why you are wrong, citing persistence of vision - as I have done.
You rebutted that with "I obviously know this" and then immediately gave an example of what you think qualifies as a valid reason for the rebuttal:
objects under your crosshair -- objects your character is clearly immediately focusing on -- also get blurred. This is unrealistic

Trying to understand it from your perspective, I interpret how you might have *meant* that quote to read in exactly the way mdrejhon called it:
auxy seems correct in this one situation -- only IF you're watching the moving objects, waiting for it to go under the crosshairs.

But if he and I are both right in thinking that's what you *meant* (his option A) rather than what you actually wrote (his option B) it just means that you *ARE* confusing/mixing the in-game character's focal point with your own on-screen focal point.

Those two things are not the same, and in the context of motion-blur the focal point of the character is *THE* deciding factor in what should be blurred and what should not.

Neither of us is trying to attack you; it's a discussion - reasoning with you. You have rejected our assessments that you're wrong and given a reason why you think you're right. All we're doing is explaining, with simple, fact-based information, that you don't understand something. A teacher explaining an easily-provable fact to a student who does not understand it is not attacking the student; the teacher is simply trying to make the student see that proven fact in a way they can both agree on.

If you think I am being a jerk, that's just my blunt way of phrasing things, and I apologise for any undue offence, but I genuinely believe you take an interest in frame-rates/fluidity/blur and I want to help you understand it better by correcting some underlying mistakes in your reasoning. If you don't think *my* reasoning is correct then I'm open to criticism, but I can't really do much with:
Auxy wrote:You're a jerk.

Give me something more substantial to work with ;)
<insert large, flashing, epileptic-fit-inducing signature (based on the latest internet-meme) here>
Chrispy_
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1462
Joined: Fri Apr 09, 2004 2:49 pm

Re: G-Sync Monitors?

Postposted on Wed Oct 30, 2013 8:51 pm

Chrispy_ wrote:Those two things are not the same, and in the context of motion-blur the focal point of the character is *THE* deciding factor in what should be blurred and what should not.
How are they not the same? You look at places on-screen besides the center of it often, do you?
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Postposted on Wed Oct 30, 2013 9:17 pm

Let's remember gameplay styles vary from human to human.
Some pro players stare religiously at the crosshairs at centre, using peripheral vision to bring stuff into the crosshairs.
Some people track their eyes all over the place; some do not. And it varies widely from game to game.
High-speed helicopter flybys. High-speed drivebys. Fast flick 180s. Fast corridor running versus waiting stationary as a sniper. Etc.
And even within those, the eye behavior of different humans varies a lot.
Thanks
Mark Rejhon
mdrejhon
Gerbil
 
Posts: 49
Joined: Tue Feb 05, 2013 10:50 am

Re: G-Sync Monitors?

Postposted on Thu Oct 31, 2013 11:23 am

auxy wrote:
Chrispy_ wrote:Those two things are not the same, and in the context of motion-blur the focal point of the character is *THE* deciding factor in what should be blurred and what should not.
How are they not the same? You look at places on-screen besides the center of it often, do you?


The crosshair represents the direction your game character is looking in. Their focal point will be static in the middle of the screen at all times.

Because we look at places on-screen besides the center all the time, our focal point is not the same as the character's focal point. Motion blur in games is calculated relative to the character's focal point, so the distinction between your focal point and the character's focal point is of critical importance when deciding what blur is and isn't desirable.

Perceived blur is both accurate and sometimes even desirable if you match the focal point of the game camera (stare at the crosshair).
You will want to minimise this blurring with strobing, low-response-time panels, and disabled motion-blur effects if you wish to track moving objects on screen that aren't at the focal point of the game camera.

Motion blur isn't always bad, and in some cases (when applied subtly) it can actually enhance the realism. With strobed displays, or old CRTs, objects moving across your peripheral vision are artificially sharp due to the strobing effect which is completely different to how it works with real movement in peripheral vision. This is actually a competitive advantage (which is why it took me so long to give up my CRTs), but outside of competitive gaming it's actually distracting and less realistic.
<insert large, flashing, epileptic-fit-inducing signature (based on the latest internet-meme) here>
Chrispy_
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1462
Joined: Fri Apr 09, 2004 2:49 pm

Re: G-Sync Monitors?

Postposted on Thu Oct 31, 2013 11:21 pm

Right: GPU-calculated motion blur effects are calculated based on motion relative to the crosshairs...
While display motion blur occurs as your real-life eyes track moving objects relative to the plane of the display itself (independently of the virtual game character's intended focal point, aka the crosshairs).

Ideally, in the distant future:
-- Games should never have to continually add motion blur; there are times where we'd like motion blur to be 100% natural to the human eye, without the game or display adding any (out of necessity to smooth out strobe effects).
-- Monitors should not need to use strobe backlights to eliminate motion blur. (avoids stroboscopic effects, wagonwheel effects, etc)
-- Motion blur should only be added artistically, ideally, and not at all times. Real life scenery isn't adding motion blur to itself on its own, outside of our brains.
-- Continuous light should be used (no light modulation, no phosphor decay, no plasma subfields, no strobe backlight, no CRT scanning, no laser scanning, no flicker), since real life is composed of continuous light.

Unfortunately it's hard to simultaneously eliminate motion-blur effects and stroboscopic effects _at_the_same_time_ (on a finite-framerate display), so right now we're stuck with compromises such as adding artificial GPU motion blur effects, or using a strobe backlight, etc. Even as we go from 120Hz->240Hz->500Hz->1000Hz, there will still be mouse strobing effects (move the mouse pointer in a circle in front of a black background). The distances between the mouse droppings will shorten and shorten at higher Hz, but never quite reach the appearance of infinite Hz, with the mouse pointer looking like a paper arrow cutout being waved around (continuous motion, no gaps, naturally generated by the human brain). Likewise, wagonwheel effects still occur even at 500Hz and beyond -- a 500Hz wagonwheel (8 spokes) spinning 1/8th of a turn per refresh will always look stationary. There are always stroboscopic issues with finite-refresh-rate displays, alas. The dream of the framerateless Holodeck will remain out of reach for a long time.
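The wagonwheel claim is easy to sanity-check numerically. This little sketch is purely illustrative geometry, not tied to any real display: a wheel advancing exactly 1/8 of a turn per sampled refresh shows an identical spoke pattern every frame, so it looks frozen at any refresh rate, 500Hz included.

```python
# Sanity check of the wagonwheel-aliasing claim: an 8-spoke wheel rotating
# exactly 1/8 turn per sampled refresh looks stationary, because each spoke
# lands exactly where its neighbour was on the previous frame.

SPOKES = 8
TURN_PER_REFRESH = 1 / SPOKES  # fraction of a full rotation between samples

def pattern_phase(frame):
    """Phase of the spoke pattern at a given frame, folded into the wheel's
    1/SPOKES rotational symmetry (identical phase = identical image)."""
    return (frame * TURN_PER_REFRESH) % (1 / SPOKES)

phases = [pattern_phase(f) for f in range(20)]
assert all(p == phases[0] for p in phases)  # every sampled frame is identical
print("All 20 sampled frames identical -> wheel appears stationary")
```

Any rotation rate that is an exact multiple of the wheel's symmetry per sample aliases the same way, which is why simply raising the refresh rate shifts the problem around rather than eliminating it.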

Realistically (in a decade from now or more), it's possible that 1000fps@1000Hz could allow a reasonable compromise in simultaneously eliminating motion blur & stroboscopic effects. No strobe backlights. No motion blur. Insignificant stroboscopic/wagonwheel effects. No GPU motion blur effects (or only very minor -- 1ms of motion blur needs to be added to eliminate strobe effects -- imperceptible to human eye yet eliminating wagonwheel effect problem). Frames generated with only 1ms of latency. Even ability to use VSYNC ON with only 1ms of added input lag. Or perhaps as a max framerate of a theoretical variable refresh rate monitor (e.g. G-SYNC). Although the four-digit framerate/Hz looks silly big, and there are points of diminishing returns, it would solve quite a lot of side effects for people who simultaneously notice _both_ strobe effects & motion blur effects (and are annoyed by both). It's like a law of physics for displays -- being unable to eliminate motion blur (whether GPU/display) simultaneously with eliminating stroboscopic/wagonwheel effects -- you can do one or the other but never both.

One can dream of combining the CRT motion clarity benefits while eliminating stroboscopic effects (an achievement that may not occur until ultrahigh framerates arrive, or exotic framerateless/refreshrateless technologies emerge that accomplish continuous movement without the use of static frames). Then game makers can add as little or as much motion blur as they want, and even avoiding GPU motion blur effects will no longer cause stroboscopic effects. Then, and only then, will 100% of motion blur be completely naturally generated by the human brain, with no external motion blur added (either by GPU or display). But then again, we're getting into far-future talk / Holodeck stuff where nothing looks "off" (no externally injected motion blur, no strobe effects, etc)...
Thanks
Mark Rejhon
mdrejhon
Gerbil
 
Posts: 49
Joined: Tue Feb 05, 2013 10:50 am
