G-Sync monitors flicker in some games—and here’s why

Displays equipped with Nvidia’s G-Sync variable-refresh tech are pretty great. But they’re not perfect: some users have been reporting slight flickering in some games.

The guys at PC Perspective did a little sleuthing this week, and they’ve both confirmed the problem and identified its cause. Turns out the issue has to do with the way G-Sync handles “stalls” in game animation—that is, cases where the frame rate briefly dips to zero, as on some loading screens or when content loads in the background. G-Sync displays can’t simply stop refreshing the image when that happens, so a failsafe measure kicks in:

Completely stopping the panel refresh would result in all TN pixels bleeding towards white, so G-Sync has a built-in failsafe to prevent this by forcing a redraw every ~33 msec. What you are seeing are the pixels intermittently bleeding towards white and periodically being pulled back down to the appropriate brightness by a scan.

PC Perspective measured this effect by taking continuous brightness readings in EVE Online with an Asus ROG Swift monitor. (In that game, the problem rears its head when the user snaps a screenshot.) What happens is basically a “very slight brightness variation,” which the site neatly graphed over time.

So, is there a fix? Not really. PC Perspective says all of the variable-refresh displays it’s tested exhibit the same problem to some degree, and Nvidia is chalking up the problem to the way LCD monitors work. “All LCD pixel values relax after refreshing,” Nvidia told the site. “As a result, the brightness value that is set during the LCD’s scanline update slowly relaxes until the next refresh.”
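
For the curious, here is a minimal sketch of how a failsafe like this could behave in principle. It is illustrative only, not Nvidia's firmware; the frame source is simulated, and the ~33-ms floor is taken from PC Perspective's description above.

```python
import random
import time

FAILSAFE_S = 0.033  # ~33 ms redraw floor, i.e. roughly a 30 Hz minimum refresh

def wait_for_frame(timeout_s):
    """Stand-in for the GPU: sometimes delivers a frame in time, sometimes stalls."""
    render_time = random.uniform(0.005, 0.060)  # simulated time to finish the next frame
    time.sleep(min(render_time, timeout_s))
    return "new frame" if render_time <= timeout_s else None

def refresh_loop(iterations=20):
    last_frame = "previous frame"
    for _ in range(iterations):
        frame = wait_for_frame(FAILSAFE_S)
        if frame is not None:
            last_frame = frame                       # frame-driven refresh
            reason = "new frame arrived"
        else:
            reason = "failsafe redraw after ~33 ms"  # keeps TN pixels from drifting toward white
        print(f"scan out {last_frame!r} ({reason})")

refresh_loop()
```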

If better LCD panels aren’t the solution, then perhaps Nvidia simply needs to cook up a better algorithm for handling frame-rate stalls.

Comments closed
    • sweatshopking
    • 6 years ago

    LET ME SUM UP THIS PAGE OF COMMENTS: QQQQQQ

    • mdrejhon
    • 6 years ago

    Main source of GSYNC flicker is actually “LCD Inversion”

    I noticed a large proportion of the flicker is caused by the LCD inversion algorithm, rather than TN fade-to-white. The inversion artifact looks like a fine checkerboard pixel pattern (see [url=http://www.testufo.com/inversion<]TestUFO Inversion[/url<]), caused by alternating DC voltage polarities on opposite refresh passes, in a checkerboard pattern. You can see in the TestUFO test that inversion causes artifacts on most LCD displays.

    This actually causes more of the flicker than the TN tendency to fade toward white, at least in my observation, especially at framerates just below or above 30fps. You end up getting stationary checkerboard pattern effects that briefly flicker to the opposite polarity and then back, so you get asymmetric time between polarities, leading to the flicker problem. This is one of the bigger causes of GSYNC flicker, bigger than TN fade-to-white (though the two are related, since inversion balances the fading effect).

    Also, another possible fix -- the module could predictively do an early re-refresh, and keep re-refreshing more often until the GPU is very certain that the currently-rendering frame will be finished during the next ~1/144 sec (the time of a refresh scanout). That would more nearly equalize the time spent between opposing LCD polarities (inversion), and thus reduce or eliminate the GSYNC flicker.

    Also, OLED can benefit from GSYNC too, but OLED also has a fade-to-equilibrium when you stop refreshing it. It is a very slow fade, but it may still cause flicker during GSYNC situations. However, there's no doubt that engineering solutions will help improve this.
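
    A rough sketch of that early re-refresh heuristic (the timing values and the prediction itself are assumed here purely for illustration): refresh early whenever the in-flight frame isn't expected within one scanout, so the time spent in each inversion polarity stays roughly balanced.

```python
# Illustrative only: a predictive early re-refresh rule, not any shipping firmware.
SCANOUT_S = 1.0 / 144.0  # time to scan out one refresh at 144Hz

def should_refresh_now(time_since_last_refresh_s, estimated_time_to_next_frame_s):
    """Refresh early if the next real frame is still further away than one scanout."""
    return (time_since_last_refresh_s >= SCANOUT_S and
            estimated_time_to_next_frame_s > SCANOUT_S)

# 10 ms since the last refresh, next frame predicted ~20 ms away -> refresh again now
print(should_refresh_now(0.010, 0.020))  # True
# next frame predicted within one scanout -> hold off and wait for the real frame
print(should_refresh_now(0.010, 0.004))  # False
```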

    • SonicSilicon
    • 6 years ago

    There is what Japan Display Incorporated calls “Memory in Pixels,” which could be a solution to the flickering. They originally showcased it in reflective-backed displays, marketing it as a low-power LCD and an alternative to electronic ink and electronic paper.
    [url<]http://www.diginfo.tv/v/12-0207-r-en.php[/url<] The video doesn't look too promising, though, with ghosting and a rather low refresh rate, but that was three years ago. I don't hold much hope since I haven't heard about it since.

    • HisDivineOrder
    • 6 years ago

    In about three months, nVidia is going to announce their new “G-Sync GTX” series. “Improving upon the already incredible smoothness provided by ‘G-Sync,’ ‘G-Sync GTX’ adds a faster processor to ensure that your TN LCD is never flashing white when it shouldn’t be. This is done by an nVidia-proprietary technology called, ‘G-Sync StillSync.'”

    “G-Sync GTX is being introduced currently as an add-in board for one specific model of Asus monitor and only that monitor. It cannot be added to any other model of display, cannot be currently purchased in any monitor for six months, and will be available for $399.99 for the controller board. As a bonus feature, nVidia is also introducing the ‘G-Sync Enabling Subscription’ for $9.99 per month*. As a perk for subscribing, the G-Sync Enabling Subscription also entitles non-Tegra users to a 15-day trial to nVidia GRID**.”

    “As always, nVidia continues to innovate in exciting and new directions. Now you can help fund the future by subscribing to one of many new and exciting nVidia services.”

    “Coming 2016: Geforce Experience Plus, a new subscription service to get drivers in beta.”

    *This optional service enables G-Sync GTX functionality to be activated on each system where an active subscription is paid. This subscription is in addition to the $399.99 add-in board.
    **GRID is currently available for free to nVidia Shield device owners and will be available to users of other nVidia hardware on a more widespread basis sometime in 2015 for a low monthly charge of $24.99.

    • Wirko
    • 6 years ago

    In before someone patents the obvious: use simple calculation to simulate the behaviour of the LCD matrix, then modulate the backlight to counter the undesired effect.
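
    A minimal sketch of that idea, with made-up numbers: model the pixels' drift toward white as an exponential relaxation since the last refresh, then dim the backlight by the same factor so perceived brightness stays roughly flat. The 2% maximum drift and the time constant are assumptions, not measurements.

```python
import math

MAX_DRIFT = 0.02          # assumed ~2% brightness gain if pixels relax for a long time
TIME_CONSTANT_S = 0.25    # assumed relaxation time constant of the panel

def drift_factor(seconds_since_refresh):
    """Fractional brightness gain from pixels relaxing toward white (a simple model)."""
    return 1.0 + MAX_DRIFT * (1.0 - math.exp(-seconds_since_refresh / TIME_CONSTANT_S))

def backlight_level(seconds_since_refresh, nominal=1.0):
    """Counter-modulate the backlight so panel drift x backlight stays ~constant."""
    return nominal / drift_factor(seconds_since_refresh)

for t_ms in (0, 8, 16, 33):
    t = t_ms / 1000.0
    print(f"{t_ms:>2} ms after refresh: drift x{drift_factor(t):.4f}, backlight x{backlight_level(t):.4f}")
```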

    • Star Brood
    • 6 years ago

    I have an ASUS 144Hz monitor with Strobelight installed. Works like a CRT. No problems after 18 months of use.

    • SonicSilicon
    • 6 years ago

    To my understanding, VESA’s Embedded DisplayPort on laptops directly controls the panels.
    So, what did VESA do with their power saving tech, that FreeSync will be based on, so that it doesn’t cause flickering? Seems like Nvidia overlooked something with G-Sync.

      • MathMan
      • 6 years ago

      In that mode, the laptop still runs at a fixed refresh rate: constant 45Hz, or constant 60Hz, whatever.

      It doesn’t change from 144Hz to 0Hz in one go. With G-Sync, by contrast, the monitor can’t know up front that it will suddenly need to switch from a high refresh rate to a low one, so it can only react, not prepare.

        • SonicSilicon
        • 6 years ago

        Ah!
        Thank you for the information. I knew there was something I had missed about the VESA standard.

        I suppose this means AMD could end up with the same flickering issue in FreeSync using the current tech on the market.

    • anubis44
    • 6 years ago

    And just think, people are actually paying extra for this new problem. 🙂

      • Westbrook348
      • 6 years ago

      It’s an expensive solution, but I’m someone who can’t handle screen tearing at all. Every time I try to disable V-sync, I end up turning it back on, I hate the artifacts so much. And V-sync has stuttering whenever frames start to take longer than 16.7ms to draw, not to mention the input lag. If I have to pick between tearing and V-sync, I choose the latter. But I’m super excited about the prospect of no longer having to. I just hope the price comes down or AMD wins this round… I also think this new flickering problem will be solved sooner rather than later by implementing higher min refresh rates (35-40Hz instead of 30Hz).

        • GrimDanfango
        • 6 years ago

        As someone who’s equally annoyed by jitter like this, G-Sync has me entirely sold. Provided you’ve got a rig that can push 35 fps minimum, it feels almost flawless… it just feels like the way we should always have been doing things. Pan a camera, and features move at a consistent buttery-smooth rate across the screen.

        The only problem is, as robust and simple as the implementation is, it won’t work in a few games due to it requiring a true fullscreen mode. A couple (but certainly not all) Unity-engine based games seem to only bother implementing fullscreen as a borderless window (Wasteland 2 and Endless Legend are the only culprits I’ve found so far). The fact that it’s not all of them leads me to place the blame on developers. G-Sync works perfectly with plenty of old games, so it’s not like it really requires any great efforts to support.

          • evilpaul
          • 6 years ago

          You can possibly force true fullscreen using GeDoSaTo with some .INI editing. Some will probably ignore it or crash because they don’t handle Focus right though.

        • Platedslicer
        • 6 years ago

        That’s funny, I’m in exactly the opposite boat. I only notice tearing occasionally, and it doesn’t bother me much; but any form of v-sync has me pulling my hair out in under 5 minutes.

          • MEATLOAF2
          • 6 years ago

          I’m the same way, Vsync (depending on the game I suppose) is one of the first things I disable, along with DoF and motion blur. I just can’t handle the input lag, and lower average frame rate.

          I use RTSS to cap the fps at 60.

            • Westbrook348
            • 6 years ago

            When I enable Vsync, I lower graphics settings enough to keep frame times consistently <16.7ms. Otherwise your frame rate is 30 fps most of the time, making Vsync as bad as the screen tearing. I will even sacrifice all AA if I have to (even on 1080p) to keep frame times down, but that’s a last resort. Screen tearing bothers me THAT much. Much more annoying than DoF and motion blur post-processing effects even (and that’s saying something), though I disable those too if I can. I don’t understand how people tolerate horizontal lines and split images on literally every frame. Maybe OK for turn-based games like Civ, but if there’s any movement, it’s the worst.

            The only time I might turn off Vsync is if a game is so new that I can’t get decent frame times even at medium settings, native res, and no AA, with my 7950. Then screen tearing might be better than the constant 30fps you get with Vsync on and frame times >16.7ms. But I don’t buy too many brand new games; I wait for sales. And the AC Unity fiasco only reinforced that.

            This is why I’m so excited about Gsync. I can increase graphics settings (often to Ultra), add more AA, get 40-60 up to 144 fps, AND no screen tearing. Best of all worlds. Well, of most worlds. They’re not making 1440p 120Hz IPS Gsync yet.

          • sweatshopking
          • 6 years ago

          I turn on vsync and don’t give a rat’s about tearing or input lag. I max gpu settings to get at least 20fps, and then I play. Seriously. I don’t lose because of 16ms. I lose because I’m bad.

    • Drewstre
    • 6 years ago

    I love the smell of progress in the morning.

      • Westbrook348
      • 6 years ago

      One step closer to 2nd gen G-sync!

    • GrimDanfango
    • 6 years ago

    The OBVIOUS solution is to stop using godawful LCD panels entirely, and get OLED monitors onto the damned market at last!

    I don’t know all the technical aspects, but I strongly suspect that OLED wouldn’t require a constant minimum refresh rate in order to hold an image, and would consistently maintain colour until the next refresh was requested.

    Plus, y’know, <1ms response time, true infinite black-levels, and all the rest of the benefits OLED have over LCD. Seriously, it’s about bleedin’ time! They’ve managed 65″ OLED TVs, and 5″ OLED smartphone screens, and they both look incredible. Why the hell are middle-of-the-road 24-30″ monitors lagging so far behind?

      • derFunkenstein
      • 6 years ago

      Given the way OLEDs burn in (like all the Samsung panels in their phones do when the static notification bar pixels turn dim over time) there’s no way I want OLED yet. I use my PC monitor way more than I do my phone display, and my SGS4 already has some display dimming at about 15 months of use.

        • GrimDanfango
        • 6 years ago

        Hmm, didn’t spot any dimming with my Note 2 after 18 months…

        I wonder if they just cheap-out on manufacturing phone displays to keep costs down. I’ve certainly been less than impressed by Samsung’s build quality elsewhere.

        I’ve been reading for at least a couple of years now that the whole OLED short-lifespan thing is basically a fallacy when it comes to TVs, and that they easily match LCD panels for longevity.

        I’m sure there are drawbacks, but LCD panels have a massive swathe of their own drawbacks and yet are basically accepted as standard. It’s not like most manufacturers would bother themselves with avoiding bringing a product to market unless it was demonstrably “perfect”. TN panels all but took over for a few years, based on the simple expedient of being barely-good-enough for the price.

        It just seems bizarre to me that aside from a few ultra-expensive Sony professional reference monitors, nobody seems to even be considering bringing one out.

      • mdrejhon
      • 6 years ago

      You do realize that GSYNC isn’t LCD specific?

      They are able to use GSYNC on OLED displays too.

        • GrimDanfango
        • 6 years ago

        That was essentially my point. I reckon OLED would be a much better choice to apply G-Sync/FreeSync to, as it doesn’t suffer from the inherent sluggishness and tendency to shift colour unless refreshed frequently. G-Sync seems like an ideal pairing with an OLED screen.

    • Krogoth
    • 6 years ago

    Silly babies are crying over a band-aid that tries to fix the known limitations of LCDs, but only reveals them in a more painful light.

    Like I said, G-sync and similar solutions are interesting gimmicks nothing more. The real solution is using a display technology that doesn’t suffer from the tearing/syncing issues in the first place.

      • trek205
      • 6 years ago

      and when you develop the perfect solution let us know…

        • Airmantharp
        • 6 years ago

        He’d have to demonstrate a competent understanding of why variable V-Sync is needed in the first place…

          • Krogoth
          • 6 years ago

          I understand it quite well.

          Unlike some of the crowd here. I don’t get a silly geekgasm over it and think it is the best thing since sliced bread.

            • Airmantharp
            • 6 years ago

            Best thing since sliced bread? I wouldn’t know how to make that comparison :).

            But it does solve what was once a difficult problem, and with relative ease and elegance.

            • Krogoth
            • 6 years ago

            >relative ease and elegance

            not according to this article and you have to get hardware that is tied to it like any other proprietary “solution”.

            Please refrain from drinking from the Green Apple kool-aid.

            The real winner is a display technology that doesn’t suffer from the problems in the first place.

            • Airmantharp
            • 6 years ago

            Because gamers have never had to buy a compatible monitor and video card before? Really?

            (and no, your arguments for some mythical display technology aren’t relevant either)

            • Krogoth
            • 6 years ago

            More power to you if you want to lock yourself into proprietary solutions that will end up being phased out. They become little more than expensive paperweights.

            The whole tearing problem is a fundamental issue with how video cards and current display solutions work. It is stuff that dates back to the 1960s-70s. It is only problematic to a vocal minority.

            G-sync and such are short-term solutions at best which come with certain caveats.

      • ptsant
      • 6 years ago

      [quote<] Silly babies are crying over a band-aid that tries to fix the known limitations of LCDs, but only reveals them in a more painful light. [/quote<]

      Well, for those who remember, CRTs were not only bulky and heavy, but also had imperfect geometry. Do you remember battling with pincushion and barrel distortion and all kinds of optical aberrations? How about non-square pixels giving a different horizontal and vertical DPI (a circle no longer looks like a circle)? I held on to my premium CRT for a rather long time, until TFT color became acceptable.

      Now, I just want a 120Hz non-TN panel with strobelight and FreeSync. I really, really don't think you can see motion blur or other refresh artifacts at 120Hz + Strobe + FreeSync. It might be a band aid, but if it can fool your eyes, it's good enough.

        • Krogoth
        • 6 years ago

        CRTs may have imperfect screen geometry, but they are faster and have far better color accuracy than any LCD unit out there. LCDs are weak at gamma.

        Tearing has more to do with how fast and accurately your GPU can throw images onto the logic circuit of the monitor. G-Sync, FreeSync and such remedy the accuracy part of it. The speed is limited by how fast the monitor can switch around the pixels. LCDs are inferior to CRTs in this respect.

      • wierdo
      • 6 years ago

      I’m hoping OLED panels will make this a temporary fix.

      I’ve been waiting for them to become mainstream for years now, seems it’s starting to finally move in that direction now, but very slowly I’m afraid.

        • Airmantharp
        • 6 years ago

        The only further issue with OLED is that while the panels refresh significantly faster, like plasma there’s still the issue of getting that signal to the monitor. OLED could stand in for Krogoth’s mythical solution if a new connection method were also designed that could take advantage of OLED’s significantly faster native refresh rate.

          • A_Pickle
          • 6 years ago

          Really? The issue with OLED that I see is that the color quality declines significantly after several years of use, blue especially…

          OLED is a good concept, but until it can match LCDs in terms of longevity, I’m not terribly interested. I don’t do badly for myself, but I don’t want to be buying a new monitor every three or four years because the color has started looking like butt.

      • blargg
      • 6 years ago

      Start with the 3D scene as objects in memory. This has to be rendered. Once rendered to memory, ideally you could instantaneously have this be refreshed at all pixels of the screen simultaneously. So you need the video memory to be part of the screen, directly accessed while rendering (and not be slower than normal fast video memory, or it’ll slow down rendering). This requires the video bus to run from the PC to monitor, or for the PC to be part of the monitor (or perhaps the 3D video card to be part of the monitor, rather than the PC).

      Short of that, you have normal video memory and transfer the image to the monitor when it’s finished. This transfer is going to be serial in some way, so scanline is as good as any. Unless you’re greatly improving the current video card-to-monitor bus speed, you’ve got current speed limitations. So what else is left? When and how often the PC sends a video image to the monitor. Ideally the PC can choose to do so, only when it’s just gotten new player input and immediately rendered the result; this gives lowest latency. And this is all G-sync is.

      The implementation of G-sync on LCD requires handling of when the PC doesn’t update for a while. The current approach has some slight artifacts is all. Implementing G-sync on another technology will encounter other problems.

      Even in the ideal, there is the big problem of motion blur: if the system uses a continuous backlight, you have guaranteed sample-and-hold motion blur. This affects LCD, LED, etc. If the system uses strobing, you eliminate this blur, but introduce the problem of how to handle the constantly-varying refresh period while keeping a consistent brightness and avoiding too low a strobe rate. You can’t just strobe more often, because that introduces motion blur or if you’re interpolating, interpolation artifacts (as far as I’m concerned, motion interpolation will never be free of artifacts).

      There are lots of inherent problems which have nothing to do with LCD, G-sync, or whatever. It’s not people choosing an inferior technology and complaining about their bad choice; it’s people identifying and finding solutions to significant problems with recreating reality with machines in a clean way.

      • mdrejhon
      • 6 years ago

      GSYNC will benefit OLEDs too.

      However, OLED does have a fade-to-equilibrium (toward black), so it may flicker, depending on how quickly the specific OLED panel tends to fade in a refreshless situation.

    • Toby
    • 6 years ago

    FWIW I use my G-Sync monitor daily (for almost a year now, thanks TR) and haven’t seen this issue on anything I’ve played. Save for one odd issue that crops up from time to time with the monitor not coming out of sleep gracefully, the experience has been great. I wouldn’t trade it in for anything; once you’re used to no tearing ever, without the mouse lag sometimes incurred by Vsync, it would be very hard to go back.

    • xeridea
    • 6 years ago

    Wait…. Similar tech has been on notebooks for a while as a power-saving thing, and I don’t see my laptop screen flickering constantly when on the desktop. Seems like they should have done their homework better before flaunting their half-baked technology that will last about…. 1 year.

    So … If your computer can’t keep a constant 30FPS you would just get flickering rather than screen tearing? That is sooo much better.

    • soryuuha
    • 6 years ago

    Solution: buy a new Nvidia G-Sync v2 monitor!
    Anyway, I wonder if the G-Sync feature can be turned off?

      • auxy
      • 6 years ago

      Yes it can.

    • Voldenuit
    • 6 years ago

    The replacement N-SYNC technology will have an “Autotune” button.

      • juzz86
      • 6 years ago

      Have a few internets, lol.

      • AmazighQ
      • 6 years ago

      And while you are at it, have Justin Timberlake promote it everywhere.

    • Chrispy_
    • 6 years ago

    The OBVIOUS solution is to stop using godawful TN panels that “untwist” their pixels so damned fast. Nobody cares about 1ms response times, it’s not like games run at 1000fps….

    I’ll take 8ms response times with an 8-bit IPS panel, please. Not only will the image quality be better, so will the colour gamut and the viewing angles, and it won’t suffer from flicker because IPS doesn’t untwist fast enough to drift very far in 33.3ms.

      • Airmantharp
      • 6 years ago

      Get me one of those 30″ LG curved 21:9 1440p IPS panels and I’ll jump.

        • internetsandman
        • 6 years ago

        That’ll be my next monitor upgrade. Nothing else seems nearly as awesome right now, especially with the DPI problems in Windows and some games.

        Semi-on topic, regarding the DPI scaling, I always thought UI elements, especially in games, would be made to take up a certain percentage of screen real estate, not a fixed number of pixels. I also thought that this percentage could be adjusted if need be. I guess that makes too much sense.

          • MEATLOAF2
          • 6 years ago

          It depends on how you implement it. I’ve dabbled in Unity3d, and all you have to do is use the “screen.width” or “screen.height” parameter, combined with basic operators, and you can have UI elements automatically scale based on screen resolution. You can even take it a step further and allow the player to offset the value with a slider to increase or decrease different parts of the GUI.
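
          A language-agnostic sketch of that approach (the fractions and scale factor below are arbitrary; Unity itself exposes Screen.width and Screen.height for this):

```python
def hud_rect(screen_w, screen_h, frac_w=0.20, frac_h=0.05, user_scale=1.0):
    """Size a HUD element as a fraction of the screen, adjustable by a player slider."""
    return (screen_w * frac_w * user_scale, screen_h * frac_h * user_scale)

# Same relative size regardless of resolution:
print(hud_rect(1920, 1080))                    # (384.0, 54.0)
print(hud_rect(3840, 2160))                    # (768.0, 108.0)
print(hud_rect(1920, 1080, user_scale=1.25))   # player bumps the GUI up by 25%
```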

          Honestly it seems like really bad practice to make UI scaling set in stone, whether by being manually unchangeable by the player, or by not automatically scaling based on screen resolution.

          I have a hard time reading some of the tiny UI elements and font sizes in some games, and I just can’t play them because of it. Never complain on the forums though, you’ll get yelled at by people using 1024×768 monitors, telling you that you need glasses.

        • Westbrook348
        • 6 years ago

        Isn’t the human FOV closer to 16:9 than 21:9?

        Don’t get me wrong, I’ve fantasized about ultra-wide before, including triple monitor setups. Euro Truck Simulator 2 just begs for peripheral vision. These big single ultra-widescreen panels look amazing. When will they get G-sync or 120Hz, though?

        For that matter, where are all the 30″ 4K monitors? I had a 30″ Korean Yamakasi 1600p and it beats 27″ every day of the week. Throw some G-sync in there..

          • Airmantharp
          • 6 years ago

          I look at 21:9 as a better solution than trying to do some sort of surround, where the small parts of the display on the sides are for peripheral vision while gaming.

      • Meadows
      • 6 years ago

      Wrong. I wouldn’t buy an IPS monitor of this type unless it could do at least 120 Hz refresh rates.

        • odizzido
        • 6 years ago

        ATM I won’t buy a monitor that isn’t IPS. I also won’t buy a monitor that isn’t 120hz.

        /waits

          • Grape Flavor
          • 6 years ago

          If I’m going to spend big bucks on a monitor I want 4K, 120Hz, G-Sync and IPS. I’m sure it will happen eventually.

          /waits

            • Westbrook348
            • 6 years ago

            Same!! But I’m in the market over the next 4-6 months, so I’ll probably choose a 1440p 144Hz over 4K 60Hz. It has to have G-sync, which seriously limits my options (two of the former and only one of the latter, as far as I know). IPS would be gravy, but I’m not going to wait for it. I’m one of those weirdos who sits directly in front of his monitor and is OK with meh viewing angles.

      • rahulahl
      • 6 years ago

      I care.
      I used to think 1ms was a gimmick as well.
      But going from 16ms to 4ms with 144Hz proved me wrong.
      I play Counter-Strike a lot in competitive matchmaking.
      After 1000 games I was stuck at nova 3 rank. Within about 50 games since getting the monitor I climbed up to Master Guardian Elite, which is about 4 ranks higher.

      I would certainly like IPS to be fast enough so I can ditch TN, but saying refresh rate does not matter is wrong. Besides, the ROG is a pretty good monitor. Even though it’s really only about 4ms rather than 1ms, the 144Hz factor is really important. I tried playing at 60Hz after that and I see stutters everywhere that I never noticed before.

        • willmore
        • 6 years ago

        Oh, yeah, my placebo is better than your placebo!

          • Platedslicer
          • 6 years ago

          While I find the “Nova III to Guardian Elite” story a bit over the top, there is in fact a noticeable difference in smoothness. It’s not likely to be the determining factor, but it’s not placebo either.

            • rahulahl
            • 6 years ago

            Well, it might seem over the top, but it certainly is true.
            I even have the screenshots to prove it.

            It’s like how people complain about 64-tick vs 128-tick servers. It might seem like 64 times a second should be quite a lot, but 128 is still noticeably better.

            Same way, 144Hz looks more fluid and feels more responsive at least in twitch games like Counter Strike.

          • Meadows
          • 6 years ago

          No placebo. The difference is easily noticeable. You do, of course, have to use your eyes.

          Hell, the difference between 60 and 80 fps is already painfully noticeable even on a cheapo LCD set to 75 Hz. I know because I’ve tried just to see it.

          The difference between 60 Hz and 120 Hz, then, is pretty much impossible to miss unless you have some sort of genetic disorder. Same as the 30-60 fps non-debate, except not as earth-shatteringly obvious.

            • Arbiter Odie
            • 6 years ago

            You are quite correct. Here’s the source of the divide, which really should be stated.

            A significant fraction of people cannot process what their eyes are seeing fast enough to use those extra frames. Unfortunately for you and me, they tend to be nerds. I can probably see (and use) 144 Hz, but I have a friend who cannot tell a smoothness change beyond 45-55 fps. Seriously!!! I can’t imagine trying to drive on the road like that, but I digress.

          • trek205
          • 6 years ago

          you are either ignorant or blind as hell not to see the difference between 60 and 144 Hz. it's night and day and will make 60 Hz look and feel like a sloppy, stuttery mess after using it. overall I prefer to use my IPS for daily use but it's certainly no placebo using a higher refresh rate screen. just panning the mouse around at 144 fps on a 144 Hz screen is an unbelievably smooth and eerie experience the first time you see it.

          • auxy
          • 6 years ago

          You’re an idiot. Please sell all your computers and never touch another one.

            • willmore
            • 6 years ago

            Did I hurt the fanboys’ feelings?

        • Chrispy_
        • 6 years ago

        I ditched a high-end 120Hz TN Samsung panel almost as soon as my Korean 1440p IPS arrived.

        To be honest with you, the visual difference in motion between 60Hz and 120Hz is night and day, but the thing that really makes motion tracking better for twitch gaming is [i<]NOT[/i<] the refresh rate, it’s mostly the stroboscopic backlight that eliminates sample-and-hold blur. This is why I still used a CRT for clan matches at least five years after everyone else switched to LCD screens.

        I’m not a competitive twitch gamer anymore but you do realise that 8ms IPS is still 1/125th of a second, right?

        144Hz IPS screens have been coming out of Korea for years, with TV manufacturers doing it years before that. I clock mine to 85Hz and it still can’t hold a candle to an 85Hz CRT screen for Quake Live matches because of the sample-and-hold blurring issue that comes from a constant backlight; it’s not conducive to the way our animal brains track motion.

        All of the 120Hz televisions are IPS or AMVA and they’re already selling 4ms AMVA panels in monitors that are theoretically capable of 250Hz. If someone tried to put out a TN television these days they’d be laughed at.

        IPS and especially AMVA panels are plenty fast enough for 120 or 144Hz. [b<]TN IS EVIL BUT THEY KEEP MAKING THIS JUNK BECAUSE WE KEEP BUYING IT - SELLING CHEAP PANELS AT HIGH PRICES IS EXCEEDINGLY PROFITABLE.[/b<] (Please stop perpetuating the myth that TN alternatives are too slow for gaming, because that's clearly not the case)

          • Meadows
          • 6 years ago

          I am not going to buy a TV for my desk.

      • Ninjitsu
      • 6 years ago

      7ms IPS here. Ghosting is quite obvious and fairly annoying at times.

        • Chrispy_
        • 6 years ago

        7ms actual, or 7ms of marketing lies? If your 7ms panel was actually 7ms, it could completely change the pixel twice over in the 16.7ms between screen refreshes. If you can see ghosting, that's proof that your panel has [i<]at least[/i<] a 16.7ms response, and for ghosting to be quite obvious, it's probably over 33ms (smeared across 2-3 frames).

        The "6ms" LG.Philips S-IPS or AH-IPS panels that are very common at 1440p are actually [b<][i<]measured[/i<][/b<] at 8-9ms. That's what I mean when I say "8ms panel". The "4ms AMVA panels" I've mentioned earlier are more variable, with measured response times of 4-11ms (median response is 6ms).

        I had an "8ms" Dell panel about a decade ago that was tested at 35-45ms actual G2G response time, with an additional 15ms of input lag. Those marketing goons have been having a field day with pixel response times since day one.

          • blargg
          • 6 years ago

          At this LCD response time, sample-and-hold blurring dominates when it’s not a strobed backlight. An instant LCD would still look crappy if it didn’t strobe the backlight.

          • Ninjitsu
          • 6 years ago

          I dunno, 7ms G2G “with overdrive” is what Dell marketing says.
          [url<]http://www1.ap.dell.com/in/en/home/22lcd/dell-s2240l/pd.aspx?refid=dell-s2240l&cs=indhs1&s=dhs[/url<]

      • kristi_johnny
      • 6 years ago

      The real advantage of TN panels is that you can build a monitor with a 120/144Hz refresh rate, which is great for desktop work, where you move a lot of windows or objects on the screen. Regarding games, it’s hard to reach that framerate, even with high-end video cards.
      It would be interesting, at least for me, to see IPS-type panels (IPS/PLS/AHVA) that can reach 120Hz; it would be great to work with such monitors. Even if the resolution were 1440p rather than 2160p, I would still buy one, it would be great for desktop work :).

        • derFunkenstein
        • 6 years ago

        Even with non-Gsync monitors, 144 or 120Hz monitors have a benefit in gaming of shorter refresh cycles. Standard V-sync on a 144Hz monitor could still render games at 72, 48, or 36FPS (assuming the GPU can hit those performance points), all of which are tangibly better than 30FPS on a 60Hz monitor.
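
        The arithmetic behind that, for the curious: with plain V-sync, a frame that misses one refresh waits for the next, so the effective rate is the refresh rate divided by a whole number of refresh intervals.

```python
def vsync_rates(refresh_hz, steps=4):
    """Effective frame rates available under plain V-sync at a given refresh rate."""
    return [refresh_hz / n for n in range(1, steps + 1)]

print(vsync_rates(144))  # [144.0, 72.0, 48.0, 36.0]
print(vsync_rates(60))   # [60.0, 30.0, 20.0, 15.0]
```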

          • Orb
          • 6 years ago

          And shorter refresh cycles lead to both better perception of motion and less input lag. But this is still very much an application problem; applications are the primary source of microstuttering. If I were to make a game, I would make sure it runs at the monitor’s refresh rate.

            • derFunkenstein
            • 6 years ago

            Yeah, it needs to be handled in the program/game as well. Without a use for higher refresh rates it’s a waste, to be sure.

      • derFunkenstein
      • 6 years ago

      It’s still a series of trade-offs. I’d love a 120Hz panel but right now the tradeoff that’s better for me is the color reproduction and viewing angles. As soon as I can get IPS-quality panels at 120Hz, I’d be interested. Fortunately for my wallet and my new monitor that seems a ways off yet.

      • Zizy
      • 6 years ago

      TFT’s fault. TN is doing what it is supposed to. And you would see the problem with IPS as well; it would start losing charge as quickly as those TN screens (assuming the same frequency for both).

      I will take <1ms response times with a 10-bit OLED panel, please.

      • Bensam123
      • 6 years ago

      A lot of gamers care about 1ms response time, not everyone cares about having a 99.9% of the Adobe RGB color gamut available to them.

        • Arbiter Odie
        • 6 years ago

        Very true. A family member of mine has one of the late 2013 iMacs. Those models have very nice screens.
        But it smears. SO MUCH SMEAR, my eyes burn! I don’t care if the colors are slightly off, if what’s on the screen is distorted by slow pixel response times.

        • dragontamer5788
        • 6 years ago

        [url=http://forums.shoryuken.com/discussion/190399/one-frame-link-math<]The one-frame link[/url<]. Fighting game players [b<]require[/b<] timing of 16ms regularly. If your screen alone takes up 8ms of latency... the rest of the system only has 8ms before the game is too laggy for those top-tier fighting gamers. 8ms is actually a long time.

        Gamers care about minimizing latency to its absolute bottom. They really don't give two ****s about color gamuts either. Fighting game players are the type to blow $200+ on a full custom controller with Sanwa buttons to minimize the latency between button presses and character action. They're not willing to give up nearly 10ms of latency because of minor color issues.

          • Chrispy_
          • 6 years ago

          You’re confusing pixel response with refresh rate.

          At 60fps (60Hz, or 16.7ms intervals) an 8-9ms IPS panel will redraw the screen at the same time as a very fast 2-4ms panel like the ROG Swift. The difference is that the transition might be 90% complete (the value usually used when measuring pixel response) a good 5ms faster on the Swift, but the pixel response follows a diminishing curve which means that both panels reach 50% (the shift point, as it’s sometimes referred to) where your eye or a camera determines a change is happening much much faster.

          For the sake of argument, let’s say that the pixel transitions 50% in one third the time it takes to reach 90%, that means that your eye detects motion on a 8-9ms panel around 3ms after the refresh, and on the Swift at about 1ms after the refesh.

          Now factor in the input lag and processing time of the AMA overdrive, and the 2ms difference becomes less significant. Compared to a CRT, the actual pixel responses to the shift points are now 6ms and 4ms respectively. Add the 4ms of average lag (between 0 and 8ms for a 125Hz USB game controller) and suddenly that’s 10 and 8ms respectively. If I’m feeling pedantic, I’ll throw in the game engine’s sample frequency to add yet more milliseconds of lag that take place between input and visible result, or vice-versa.

          Point is that pixel response is rarely the issue in the TN vs IPS/MVA/PLS debate. I’ve spent 15 years gnashing my teeth about framerates and lag, and the visible shift point between the two technologies is just a couple of milliseconds apart.
          Compare that to the rolling refresh of a “lag-free” CRT and you realise that at 60Hz the bottom of the screen refreshes 16.7ms later than the top for the same frame. How is that [i<]not[/i<] 16.7ms of lag? 😉
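
          Tallying those numbers in one place (all values are the illustrative figures from above, not measurements):

```python
def input_to_visible_ms(shift_point_ms, overdrive_ms, controller_poll_hz):
    """Rough input-to-visible-change latency: pixel shift point + overdrive processing
    + the average wait for the next controller poll."""
    avg_poll_wait_ms = 1000.0 / controller_poll_hz / 2.0
    return shift_point_ms + overdrive_ms + avg_poll_wait_ms

ips = input_to_visible_ms(shift_point_ms=3.0, overdrive_ms=3.0, controller_poll_hz=125)
tn = input_to_visible_ms(shift_point_ms=1.0, overdrive_ms=3.0, controller_poll_hz=125)
print(f"IPS ~{ips:.0f} ms, TN ~{tn:.0f} ms")  # ~10 ms vs ~8 ms
```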

            • rahulahl
            • 6 years ago

            Sorry for being pedantic, but USB polling goes up to 1000Hz.
            I know my mouse is set to 1000 rather than the default 125.

            • Chrispy_
            • 6 years ago

            Dragontamer’s using a fighting game, which means 125Hz gamepad polling on PC.

            The XB360 pad is 125Hz, the PS3 pad is 100Hz. I honestly don’t know about third party stuff and haven’t spoken to my kinaesthetics guy since the new consoles launched.

            • dragontamer5788
            • 6 years ago

            I did a bit of research, it does seem like IPS panels have gotten a lot better now. There was a time when TN was the only name in the game with decent display lag statistics however.

            displaylag.com is my reference, and their top panels are all IPS, measured at 8ms at the bottom of the panel. (bottom of the panel has more lag because of reasons you’ve already mentioned).

            • Chrispy_
            • 6 years ago

            yeah, displaylag.com, blurbusters.com and tftcentral are probably the top three resources for this sort of info.

            • dragontamer5788
            • 6 years ago

            [quote<]Compare that to the rolling refresh of a "lag-free" CRT and you realise that at 60Hz the bottom of the screen refreshes 16.7ms later than the top for the same frame? How is that not 16.7ms of lag? ;)[/quote<]

            With the other issue settled, it's time for [b<]me[/b<] to be pedantic. CRT monitors have a VBlank section as part of their signal (aka overscan). So the bottom line (the 480th line of 525 lines total in NTSC; different numbers for European / PAL) refreshes 15.26ms after the top line.

            • Bensam123
            • 6 years ago

            It all adds up.

            If we weren’t able to perceive any differences at this level they wouldn’t have had problems with vertigo with the Oculus Rift. How much information can be gleaned from those response times very much remains up to each individual. Gnash away bro.

      • xeridea
      • 6 years ago

      But with the extra cost of the FPGA they couldn’t afford to put it on quality monitors.

      • ptsant
      • 6 years ago

      There is no need to sacrifice refresh rates or GTG response when buying a non-TN panel. EIZO makes the FG2421, which offers a VA panel with 120Hz AND strobelight (which they call “Turbo 240Hz”). Plus it has ultra-low input lag (a major issue for competitive gaming).

      Obviously, you have to afford this kind of premium monitor. But the day they make a FreeSync (ie VESA DP standards compliant) version, I will beg them to take my money.

      The only advantage of TN is price. Never again.

        • Krogoth
        • 6 years ago

        CRTs are still superior to LCDs in speed and color accuracy. They were discontinued because of eWaste (CRTs use a ton of heavy metals) and LCDs were good enough for the vast majority of the people out there. They don’t care about super fast speeds or need super accurate color representation.

        Input lag isn’t a major factor in competitive gaming. It is all about image and information clarity. That’s why progamers set in-game details to low, remove all extra lighting effects and set brightness to max. It allows the progamers to optimize their situational awareness. The “super-fast” reflexes are really progamers anticipating their opponents’ actions seconds before they happen. They get most of this from practice and experience (progamers spend hours memorizing maps and sparring with each other).

          • Pez
          • 6 years ago

          Excellent post and all true. I still keep an Iiyama Vision Master 22″ CRT for serious Quake sessions and it’s miles superior to any LCD.

    • derFunkenstein
    • 6 years ago

    This would probably be more memory-intensive than dumping the frame buffer after it’s drawn once, but it might be to their benefit to hang onto it so the driver can re-send it to the monitor if the next frame isn’t ready after X ms.

      • Airmantharp
      • 6 years ago

      See, this is what I thought they were doing :/

        • Terra_Nocuus
        • 6 years ago

        I thought the G-Sync module had a ~700MB buffer for when the pixels need a refresh & the GPU’s not ready

          • sschaem
          • 6 years ago

          It does, but they can't refresh fully async to the game engine.

          They force a refresh if the G-Sync module did not receive a new frame in 33ms.
          But the G-Sync module has to wait for 33ms in case a new frame is received.

          If it doesn't wait, and starts a refresh after 16ms, it will display an old frame just as a new frame is being delivered.

          BTW, this also hints that if the game sends a frame just after this automatic refresh, you will get a delay in displaying the frame sent by the GPU… like a micro stutter.

          For 4K and games running close to 30fps, this might be measurable.

          Games could probably take this into account, so if they know they render at 31fps, they could run at 62fps with alternating duplicated frames (easier said than done), stopping the potential G-Sync ‘flicker’ and micro stutter.
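
          Back-of-the-envelope for that micro-stutter case (numbers assumed, using a 144Hz scanout): a frame that arrives just after the forced redraw begins has to wait out the remainder of that scanout before it can be shown.

```python
REFRESH_SCANOUT_MS = 1000.0 / 144.0  # ~6.9 ms to scan out one frame at 144Hz

def extra_delay_ms(arrival_after_forced_refresh_ms):
    """Delay added to a frame that arrives while a forced redraw is still scanning out."""
    return max(0.0, REFRESH_SCANOUT_MS - arrival_after_forced_refresh_ms)

for arrival in (0.5, 3.0, 7.0):
    print(f"frame {arrival} ms after the forced redraw starts -> +{extra_delay_ms(arrival):.1f} ms wait")
```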

          • Lans
          • 6 years ago

          I have no idea how G-Sync really works but I am not sure if the G-Sync module would really need its own buffer given there doesn’t seem to be any flickering for old fixed rate refresh monitors when you are getting very low FPS.

            • MathMan
            • 6 years ago

            In fixed-rate refresh monitors, the monitor never has low FPS: it’s fixed at 60Hz. If the GPU renders at low FPS, it will resend the same frame or a torn frame to the monitor at 60Hz.

            • Lans
            • 6 years ago

            Precisely my point… I assumed people here would get that FPS is a product of the GPU from all the reviews we’ve been reading…

        • derFunkenstein
        • 6 years ago

        That’s kind of how it reads, but they must be redrawing a black frame if it’s blacking out the pixels to prevent them from going white.

          • JustAnEngineer
          • 6 years ago

          LCD panels are clear when powered down, allowing the white backlight to show through. If the TN pixels fade really quickly, they let the white through.

            • derFunkenstein
            • 6 years ago

            Then their fancy module should refresh the screen more often if it’s not getting more stuff to do.

            • Meadows
            • 6 years ago

            Works fine as it is. Pixel persistence seems to be the problem. There’s nothing wrong with 30 Hz in itself but the display technology could probably use a facelift. Anyway, 30 fps is critically low so it’s a good indication that you need to change some settings.

            I also don’t believe this flicker to be a deal-breaker even if it gets to be noticeable.

            • MathMan
            • 6 years ago

            No, that’s not true.

            When TN fades it goes to a transparent state. IPS is just the opposite.

            • Meadows
            • 6 years ago

            That’s exactly what he said.

            • MathMan
            • 6 years ago

            He said pixels become clear for LCD panels when not being refreshed/powered down. That’s too general: it’s only true for TN, not for IPS.

    • nanoflower
    • 6 years ago

    Could they change that redraw rate? Perhaps a faster redraw rate when there is a stall in incoming frames would alleviate the problem.

      • DPete27
      • 6 years ago

      33ms is the equivalent of 30Hz (which IIRC is the lowest refresh rate of G-Sync). The white fade must be pretty abrupt if users are noticing this.

        • Westbrook348
        • 6 years ago

        To see that graph of brightness spike from 1.009 to 1.027% during that 33ms is crazy… Obviously an amplitude of ~1% is fine (that’s what we’re seeing at 144Hz), but the larger 2% variability at 30Hz makes a much bigger difference. Maybe Nvidia should consider a faster minimum refresh of 25-30ms. There seems to be a lot less flickering at those rates; at what point does it vanish? Perhaps if someone with a G-sync monitor could graph the brightness amplitude as a function of frame times, we could find the sweet spot for minimum refresh rate to avoid subjective flickering.

    • beck2448
    • 6 years ago

    Frame rate to zero? Maybe a coding problem as well?

      • auxy
      • 6 years ago

      Do you play games? It’s very common for games to stall the render thread while it waits on loading or things like screenshot capture.

      It may be bad programming, especially in this era of ubiquitous multiprocessing, but it’s also standard practice.

      • sschaem
      • 6 years ago

      If there is nothing new to render, it's actually good practice. It saves power.

      Not a big deal on desktop, but great to implement if the engine will ever run on mobile devices.

      Stalls are usually caused by the API when a game engine naively tries to stream data.
      At some point, even Microsoft's answer to this was "Load all the data for the level before the game starts"; "do not load data during gameplay."

      DX12 should finally solve all this, and then the blame will fully be on game developers.

        • UnfriendlyFire
        • 6 years ago

        On the flip side, there are games that will allow over 500 FPS (including in menus), which requires FPS limiter software to keep the FPS sane.

        I know for certain that TF2 and Starcraft 2 have this kind of issue.

        In fact, SC2 caused some users’ GPUs to approach overheating if they stayed in the menu too long because SC2 would tell the computer to render the max amount of frames.

      • Farting Bob
      • 6 years ago

      According to ubisoft, zero FPS makes games seem like art on a canvas. AC: Unity is a good example of this.

      • ptsant
      • 6 years ago

      Zero frame rate implies a still image for eternity. Anyway, close-to-zero frame rates are the norm with office tasks and the like, so there should be a way of gracefully handling those. Laptops and movies are also a low-fps scenario, where maintaining 25-30fps would be a way of saving energy and avoiding motion blur.

        • Meadows
        • 6 years ago

        The problem is whether pixels can persist between so few refreshes, which is where display technology comes in.
