Blind test suggests gamers overwhelmingly prefer 120Hz refresh rates

Most LCD monitors update the display at 60Hz—once every 16.7 milliseconds. Faster 120Hz displays have been around for a while, though, and I’ve always been curious about how their higher refresh changes the user experience. None of the 120Hz displays I’ve used have been set up for a proper apples-to-apples comparison, so I haven’t gotten a good sense of the difference.

Cue the folks at Hardware.Info, who set up a blind test to determine whether gamers prefer playing at 60 or 120Hz on otherwise identical systems. The results were pretty conclusive: 86% preferred the 120Hz setup. Impressively, 88% of the subjects were able to correctly identify whether the monitor was refreshing at 60 or 120Hz.

Those who preferred gaming at the higher refresh rate reportedly described the experience as smoother and more fluid. Their improved experience didn’t consistently lead to better kill-to-death ratios, though. A higher refresh rate probably won’t augment your mad skillz.

The test was sponsored in part by AOC, whose 120Hz displays were used in the systems and given away as prizes. Asus, another maker of 120Hz displays, also supplied gear for the systems. While both companies have an interest in promoting 120Hz displays, the test appears to have been a fair one.

120Hz displays are pricier than their 60Hz counterparts, and there are other caveats attached. You need a pretty fast system to pump out frames at 120 FPS, and you may need to live with TN panel technology. Apart from "overclocked" Korean LCDs selling on eBay, we haven’t seen any IPS monitors advertise refresh rates above 60Hz. I spend too much time in Photoshop to give up true 8-bit color on my desktop display, but I wonder whether gamers would prefer a higher refresh rate to accurate color reproduction. More blind testing is required.

Comments closed
    • ColeLT1
    • 6 years ago

    Last night after reading this I decided to finally “overclock” my Dell 2408wfp to a higher refresh rate. Got a massive 62Hz, 63 to 75 just jumbles/fuzzes the screen, and higher just won’t display. I guess a gain is a gain.

    • draconian
    • 6 years ago

    I overwhelmingly prefer 120 fps to 60. Even 80 fps just “feels” too choppy.

    • Deijya
    • 6 years ago

    The troll in me screams, “Screw you all, I’m getting a 900hz 3D capable plasma TV!”

    • internetsandman
    • 6 years ago

    I’d love a 120Hz display, but I’m not willing to give up a 1440p resolution, and I sure as hell don’t wanna buy from a questionable international retailer and have to overclock it myself. Maybe once 4K displays start being the norm we’ll see 120Hz 1440p monitors, but right now it’s just the one thing on my upgrade list confined to the fantasy world.

    • StashTheVampede
    • 6 years ago

    One game where having the additional FPS mattered: Quake 3 (possibly its predecessors too, but Q3 was heavily documented). Somewhere around 125 FPS is where a large chunk of competitive players wanted to be: you moved faster (along a vector) and weapons could fire faster. Doom 3 and Rage capped the framerate at 60Hz to stop “hardware” exploits.

    The world of uber-twitch FPS titles is mostly gone, but there were a few where having the additional frames mattered.

      • Prestige Worldwide
      • 6 years ago

      The frame rate also affected the user’s jump height and fall damage. This translated into other Quake 3 based games such as COD4.

      For example, jump height increased with higher FPS rates, and fall damage decreased with higher FPS rates.

      COD4 Jump Height:
      [quote<]
      63 FPS or less - jumps only 39.5 units max (exceptions are 52 and 55 FPS, which behave like 71 FPS)
      71 FPS - character jumps to 40 units stably; 40.5 units is impossible at this value; same results for 76 FPS
      83, 90, and 100 FPS - bad again, 39.5 units max
      111 FPS - 40 and 40.5 units are cleared
      125 FPS - the 41-unit height is cleared
      142, 166, and 200 FPS - strange falloff again; character clears 39.5 units max
      250 FPS - the 42-unit line is cleared
      333 FPS - same as in COD2; at 333 FPS you can jump 46 units
      500 FPS - bugged value; the character runs silently and jumps only 35 units from a standstill, but about 39 units with a running start
      1000 FPS - behaves the same as 500 FPS
      [/quote<]

      COD4 Fall Damage:
      [quote<]
      Frame rate (height) - damage
      ---------------------------------------------
      71 (140) - 4
      91 (140) - 8
      100 (140) - 6
      110 (140) - 4
      125 (140) - 2
      200 (140) - 6
      250 (140) - 1
      333 (140) - 0
      ------------------
      71 (200) - 39
      91 (200) - 44
      100 (200) - 41
      110 (200) - 38
      125 (200) - 35
      200 (200) - 41
      250 (200) - 34
      333 (200) - 22
      ----------------------
      71 (300) - 96
      91 (300) - dead
      100 (300) - 98
      110 (300) - 94
      125 (300) - 91
      200 (300) - 98
      250 (300) - 88
      333 (300) - 72
      ------------------------
      71 (350) - dead
      91 (350) - dead
      100 (350) - dead
      110 (350) - dead
      125 (350) - dead
      200 (350) - dead
      250 (350) - dead
      333 (350) - 97
      [/quote<]

      • Krogoth
      • 6 years ago

      You realize that a higher rendered framerate has nothing to do with the monitor in Quake 3, right? The only reason it was sought after is that the game’s physics engine, both server- and client-side, was tied to the framerate. A high framerate gave clients a slight but noticeable advantage in jumping and movement over clients with a modest framerate. It was very bad in the earlier versions of Quake 3. It was mostly “fixed” in 3.20, but the problem still appears if you go beyond 200-300 FPS.
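This framerate-tied physics can be sketched with a toy model. Quake-engine games ran player physics in whole-millisecond frames and snapped velocity to an integer each frame, so the gravity applied per frame depends on the frame rate; at "magic" frame rates the rounding systematically under-applies gravity and jumps get higher. The constants below (270 ups jump velocity, 800 ups² gravity, round-to-nearest snapping) are an approximation for illustration, not the engine's verbatim movement code:

```python
def jump_height(fps, v0=270.0, gravity=800.0):
    """Toy model of Quake-3-style jump physics.

    The engine used integer-millisecond frame times and snapped
    velocity to an integer every frame, so the effective per-frame
    gravity depends on fps. Illustrative constants only.
    """
    msec = round(1000.0 / fps)   # integer frame time, as the engine used
    dt = msec / 1000.0
    height, velocity, peak = 0.0, v0, 0.0
    while True:
        velocity = round(velocity - gravity * dt)  # per-frame integer snap
        height += velocity * dt
        peak = max(peak, height)
        if height <= 0.0:        # landed
            return peak

for fps in (63, 100, 125, 333):
    print(f"{fps:3d} fps -> peak jump ~{jump_height(fps):.1f} units")
```

Even this crude sketch reproduces the qualitative pattern in the COD4 table above: 63 FPS jumps lowest, and 125 and 333 FPS jump progressively higher than 100 FPS, purely because of how the rounding interacts with the frame time.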

    • Firestarter
    • 6 years ago

    [quote=”Wikipedia”<]A blind or blinded experiment is a test or experiment [b<]in which information about the test that might lead to bias in the results is concealed from the tester, the subject, or both[/b<] until after the test. Bias may be intentional or unconscious. If both tester and subject are blinded, the trial is a double-blind trial.[/quote<]

    [url<]https://en.wikipedia.org/wiki/Blind_experiment[/url<]

    • ronch
    • 6 years ago

    This ain’t no blind test. If you really wanna do blind testing, the testers should be blind-folded and asked whether 120Hz makes a difference.

    • npore
    • 6 years ago

    Hello TR peeps. Long time lurker here. Thought I should post this as I didn’t see it mentioned:

    [url<]http://www.tftcentral.co.uk/articles/motion_blur.htm[/url<]

    Basically, going from 60Hz to 120Hz halves motion blur, but by using a strobe backlight such as in displays supporting NVIDIA LightBoost (hacked to be active in 2D mode - it's usually only on in 3D), motion blur is up to 92% less. CRT-like, apparently. For me, as a gamer, more fluid motion and less blur matter far more than the more accurate colour reproduction you get in non-TN panels.

    I hear those 120Hz 1440p IPS Korean monitors are better than 60Hz ones, but they don't have pixel response times as fast as the TN 120Hz panels, and they don't have strobe backlights to further eliminate blur.

    I'll be in the market for a new monitor at the end of the year, but unfortunately it looks like I'll have to step down to 1920x1080 from my current 1920x1200. I will sorely miss those 120 vertical pixels. 16:9 ain't so bad once you get to 1440; unfortunately there aren't any LightBoost 1440p monitors yet.

      • mdrejhon
      • 6 years ago

      Agreed — refresh-synchronized strobe backlights do a great job of eliminating motion blur. Several 120Hz and 144Hz monitors have a motion-blur-eliminating strobe backlight, including the ASUS VG248QE, VG278H, and VG278HE, as well as the BenQ XL2411T, XL2420T, XL2420TE, and XL2720T (see [url=http://www.blurbusters.com/faq/120hz-monitors/<]supported 120Hz monitors[/url<]).

      [quote<]CRT-quality motion on LCD, with less motion blur than a Sony FW900 CRT. [i<]"The backlight is turned off while waiting for pixel transitions (unseen by human eyes), and the backlight is strobed only on fully-refreshed LCD frames (seen by human eyes). The strobes can be shorter than pixel transitions, breaking the pixel transition speed barrier! In addition, it eliminates the sample-and-hold effect."[/i<] -- [url=http://www.youtube.com/watch?v=hD5gjAs1A2s&feature=player_embedded<]high speed 1000fps video[/url<][/quote<]

      LightBoost motion blur elimination is apparently a major selling feature of 120Hz monitors now -- in the [url=http://www.amazon.com/product-reviews/B00B2HH7G0<]customer reviews[/url<] of the VG248QE on Amazon (hit Ctrl+F and search for "lightboost"), the LightBoost backlight shows up as a major selling point. Manufacturers need to take notice. These backlights have recently received [url=http://www.blurbusters.com/zero-motion-blur/media<]media coverage[/url<] (Ars Technica, AnandTech, TFTCentral, pcgameshardware.de, etc.), and many forum posters have written [url=http://www.blurbusters.com/zero-motion-blur/testimonials<]stunning testimonials[/url<] about them. This is a worthy development that needs to become a feature of future 120Hz monitors, including IPS ones.

      Also, there is now an easy utility called "ToastyX Strobelight" that makes it easy to turn LightBoost on and off for 2D gaming. It can be enabled via a hotkey.

      • mdrejhon
      • 6 years ago

      Also, a good photo comparison: [url=http://www.blurbusters.com/faq/60vs120vslb<]60Hz vs 120Hz vs LightBoost[/url<]

        • Firestarter
        • 6 years ago

        I think it bears repeating that these photos actually show the effect that your eyes see, because they used a tracking camera setup that tracks the object like your eye would. It’s a really good demonstration of the dramatic effect that lightboost can have.

    • jensend
    • 6 years ago

    To those of you saying “hey I could tell the difference with higher refresh rates on my CRT so obviously MOAR HURTZ == BETT0R,” your experience with a CRT’s vertical scan rate is completely irrelevant to the frame rate of an LCD.

    Phosphor persistence in CRTs is generally on the order of tens of microseconds; any given part of a CRT display is completely dark well over 99% of the time. ([url=http://www.youtube.com/watch?v=zVS6QewZsi4<]Here's a high-speed video[/url<]; even that still shows a lot of "persistence of vision" effect, because the high-speed camera's exposure times are much longer than the time the phosphors are actually bright.) So yes, our eyes and brains can in many circumstances (especially with large viewing angles and reasonable ambient light) easily tell when they're getting very short pulses of bright light with >16ms of darkness in between. The flicker fusion threshold - the point at which the flashes are perceived as an unproblematic continuous image - can be over 100Hz. (I personally couldn't sit in front of a 60Hz CRT for more than a minute without getting very irritated and often getting a headache.)

    The light of an LCD is constant. Discerning frame changes in continuous motion is completely different from discerning bright flashes in otherwise total darkness. There is plenty of good science behind the assertion that >50fps rates make no significant difference [b<]for real-world video[/b<], ranging from over a century of perceptual testing to our modern understanding of the reactions that go on in rods and cones. Claims to the contrary are composed entirely of pseudoscience, magical thinking, and confirmation bias.

    Peculiarities of video game programming cause somewhat higher frame rates to be helpful [b<]for games[/b<]. The two most obvious reasons are timing interactions (input lag, refresh/scanout, etc.) and the temporal aliasing caused by normal renderers' "infinite shutter speed", i.e. lack of exposure blur. (Of course, inconsistent frame times are another important difference between games and video. Most of the reason people are still stuck on the idea of ultra-high FPS is that they haven't internalized the lessons of "inside the second." Beyond the flimflam and the hype, the main reason seeing your FPS counter hovering in the stratosphere ever received positive associations in gamers' minds is that it generally comes with fewer >33ms frames.)

    How much higher of a consistent frame rate will continue to give benefits in a game? We don't have adequate tests to say. The only game-focused test I'm aware of is [url=http://web.cs.wpi.edu/~claypool/papers/fr-rez/paper.pdf<]this paper[/url<]. Note the small differences in both perceived quality and player performance between 30 and 60 fps - the error bars overlap significantly. (Note that these are consistent 33ms and 16ms frames, not just averages.)

    This Hardware.Info test was badly biased by extreme tearing due to no vsync and >140fps framerates, and it has other problems too. An unbiased, properly set up test would have vsync enabled, each test subject would do multiple tests on the same monitor (isolating the variable), the refresh rate would be selected randomly for each test (double blind), and you'd figure out how unlikely the subjects' success rate would be under random guessing. ([url=http://xcorr.net/2011/11/20/whats-the-maximal-frame-rate-humans-can-perceive/<]BTW, here's another piece on the subject.[/url<])
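That last significance check is a short computation. As a sketch, here is the exact binomial tail probability that the reported 88% identification rate could arise from coin-flip guessing; the article does not state the number of subjects, so n = 50 below is purely an assumed value for illustration:

```python
from math import comb

def binomial_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p): the chance of k or more correct
    identifications if every subject were guessing at random."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# The article reports 88% correct identification. The number of subjects
# isn't stated, so n = 50 is an assumption for illustration only.
n = 50
k = round(0.88 * n)  # 44 subjects correct
print(f"P(at least {k}/{n} correct by guessing) = {binomial_tail(n, k):.2e}")
```

Under that assumed sample size, the result would be wildly inconsistent with random guessing; the bias jensend describes is about *why* subjects could tell the setups apart (tearing), not whether they could.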

      • mdrejhon
      • 6 years ago

      Here are some scientific links. Most motion blur on LCDs is caused by the sample-and-hold effect. That’s why CRT 60fps@60Hz has less motion blur than (non-LightBoost) LCD 120fps@120Hz. Raising the Hz has far more benefit on an LCD than on a CRT because of the sample-and-hold effect (frames continuously shining for the whole refresh). It’s also the reason [url=http://www.blurbusters.com/faq/oled-motion-blur/<]why some OLEDs have motion blur[/url<] -- they shine continuously between refreshes too.

      When your eyes are tracking moving objects on a screen, they aren’t in the same position at the beginning of a refresh as at the end of it. So if the frame shines continuously for the whole refresh, it is smeared across your retinas as your eyes track. This is the motion blur caused by sample-and-hold. Here are scientific references:

      “[url=http://msdn.microsoft.com/en-us/windows/hardware/gg463407.aspx<]Temporal Rate Conversion[/url<]” (Microsoft Research)
      [i<]Information about frame rate conversion that also explains how eye tracking produces perceived motion blur on a sample-and-hold display, including explanatory diagrams.[/i<]

      “[url=http://www.google.com/#hl=en&tbo=d&output=search&sclient=psy-ab&q=Correlation+between+perceived+LCD+motion+blur+and+MPRT+measurement<]Correlation between perceived motion blur and MPRT measurement[/url<]” by J. Someya (SID ’05 Digest, pp. 1018-1021, 2005)
      [i<]Covers the relationship between human-perceived motion blur and a display’s Motion Picture Response Time (MPRT). This also accounts for motion blur caused by eye tracking on a sample-and-hold display, a separate factor from pixel persistence.[/i<]

      “[url=http://reshal.ru/wp-content/uploads/2011/11/3-CRT-LCD.pdf<]What is needed in LCD panels to achieve CRT-like motion portrayal?[/url<]” by A. A. S. Sluyterman (Journal of the SID 14/8, pp. 681-686, 2006)
      [i<]An older 2006 paper that explains how a scanning backlight can bypass much of an LCD panel’s pixel persistence.[/i<]

      “[url=http://itg32.hhi.de/docs/ITG32_Sony_07_2_185.pdf<]Frame Rate Conversion in the HD Era[/url<]” by Oliver Erdler (Stuttgart Technology Center, EuTEC, Sony Germany, 2008)
      [i<]Page 4 has very useful motion blur diagrams comparing sample-and-hold versus impulse-driven displays.[/i<]

      “[url=http://www.mpi-inf.mpg.de/resources/3DTemporalUpsampling/3DTemporalUpsampling.pdf<]Perceptually-motivated Real-time Temporal Upsampling of 3D Content for High-refresh-rate Displays[/url<]” by Piotr Didyk, Elmar Eisemann, Tobias Ritschel, Karol Myszkowski, and Hans-Peter Seidel (EUROGRAPHICS 2010, guest editors T. Akenine-Möller and M. Zwicker)
      [i<]Section “3. Perception of Displays” (and Figure 1) explains how LCD pixel response blur can be separate from hold-type (eye-tracking) motion blur.[/i<]

      “[url=http://research.nokia.com/files/bergquist_johan_seminar_sidtw_071219.pdf<]Display-induced motion artifacts[/url<]” by Johan Bergquist (Display and Optics Research, Nokia-Japan, 2007)
      [i<]Many excellent graphics and diagrams of motion blur, including impulse-driven and sample-and-hold examples.[/i<]
      ____________________

      Finally, some LCDs use a strobe backlight that behaves (to the human eye) more like a CRT: [url<]http://www.blurbusters.com/zero-motion-blur/video/[/url<]

      "The backlight is turned off while waiting for pixel transitions (unseen by human eyes), and the backlight is strobed only on fully-refreshed LCD frames (seen by human eyes). The strobes can be shorter than pixel transitions, breaking the pixel transition speed barrier! In addition, it eliminates the sample-and-hold effect." ([url=http://www.youtube.com/watch?v=hD5gjAs1A2s<]high speed video[/url<])

      It apparently works well, according to many people: [url<]http://www.blurbusters.com/zero-motion-blur/testimonials/[/url<] There are also a lot of great reviews among the VG248QE customer reviews on amazon.com: [url<]http://www.amazon.com/product-reviews/B00B2HH7G0/[/url<] (hit Ctrl+F and search for "lightboost").

      As the references above show, motion blur on displays is essentially dictated by how long each frame is displayed (sample-and-hold motion blur). So a 60Hz CRT that strobes the image for 1ms (a phosphor illuminate-and-decay cycle) will always have less motion blur than a regular 120Hz LCD, which shines the image for a full 8.3ms (1/120 sec). The only ways to reduce motion blur are to increase the Hz (sample-and-hold displays) or to add black periods between refreshes (flicker displays). This is why increasing the Hz has a bigger effect on an LCD than on a CRT: it reduces the sample-and-hold effect.

      Video games also benefit more than video because of their sharp edges. Video often has softer focus and can carry source-based motion blur (e.g. camera shutter, overcompression), so games benefit far more from these improvements. And in the high-definition era, edges are so sharp that even slight amounts of motion blur become easily noticeable.
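The sample-and-hold effect described in these references reduces to simple arithmetic: perceived blur width is roughly the eye-tracking speed multiplied by the time each frame stays lit. A quick sketch with illustrative numbers (the 1.4ms strobe length is an assumed value, in the ballpark of strobe-backlight pulse widths):

```python
# Perceived blur on a sample-and-hold display: while the eye tracks a
# moving object, a frame lit for t seconds smears across the retina by
# (tracking speed) x t. All numbers below are illustrative.

def blur_px(speed_px_per_s, frame_visible_s):
    """Approximate perceived blur width in pixels."""
    return speed_px_per_s * frame_visible_s

speed = 960.0  # object panning at 960 px/s (half a 1080p-wide screen per second)
for label, visible in [("60Hz sample-and-hold", 1 / 60),
                       ("120Hz sample-and-hold", 1 / 120),
                       ("120Hz + 1.4ms strobe", 0.0014)]:
    print(f"{label:22s} -> ~{blur_px(speed, visible):5.1f} px of blur")
```

This is why doubling the refresh rate halves the blur, while a short strobe cuts it by an order of magnitude regardless of refresh rate.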

    • Jakubgt
    • 6 years ago

    A few months back I purchased a Korean IPS 1440p monitor (Qnix). Little did I know these bad boys can be overclocked. I managed to get mine up to 120hz on a GTX 760 and you actually can notice a nice difference. Some people have gotten over 130hz+, but I haven’t tried going past 120hz.

    • yammerpickle2
    • 6 years ago

    I like my Asus 23″ 1920x1080 120Hz monitor for gaming. I bought it quite a while ago, and while there was a slight price premium for the refresh rate, it wasn’t much of one. Yes, the color is not perfect, but if you’re a video or photo editing person, you’ll probably be buying a different class of monitor anyway. What I’m really looking forward to is a 40″-or-larger 4K OLED 120Hz monitor. I’ve been saving up so I’m ready when they hit.

    • Cyco-Dude
    • 6 years ago

    no, more blind testing is not required; this is kind of a “no duh” test, to be honest. even 2ms tn panels show a lot of ghosting, which the 120 panels mitigate quite nicely. go look at any 120 panel review at x-bit labs; this is old news.

      • travbrad
      • 6 years ago

      Well, part of it is that a lot of “2ms” panels aren’t really 2ms. Those are manufacturers’ claimed response times, not independently tested ones. Overdrive usually has a bigger impact on ghosting than response time anyway. My last three monitors have had 2ms, 5ms, and 8ms response times, and the 8ms one has noticeably less ghosting than the “2ms” or “5ms” ones.

      It’s true that 120hz monitors generally have lower response times and less ghosting, but even among 120hz monitors there are some which show a significant amount of ghosting.

    • DarkUltra
    • 6 years ago

    RTS games don’t need a high fps, as the cursor is written to the framebuffer at the display refresh rate. I’d like to see how many people would notice the smoother cursor in RTS games and on the desktop.

    I did a small blind test with my sister, brother and a friend and they could all see a difference.

    • travbrad
    • 6 years ago

    I don’t think anyone doubted that 120hz is a smoother gaming experience, the question is really whether it’s worth the cost (since 120hz monitors are usually at least 2x as expensive as their 60hz counterparts). You can also get a 1440p display for about the same cost as a 120hz 1080p display, which gives you A LOT more pixels.

    There is also the issue of needing more expensive hardware to run near 120FPS, and/or compromising quality settings. TN isn’t great for non-gaming tasks either.

    Really the ideal situation would be to have both a 120hz display AND an IPS display, because neither of them is perfect for everything.

    • danny e.
    • 6 years ago

    My color laser printer refreshes @ .5 Hz. Much better than my old ink jet which was about .1Hz.
    Still, gaming on the laser printer is cumbersome as there’s a tremendous amount of lag.

    On the plus side, if I ever want to brag about my skillz I gots the screenshots all piled up!

      • Generic
      • 6 years ago

      I don’t know if you’re aiming for anything beyond pure absurdity, but I enjoyed every bit of it. 🙂

    • jessterman21
    • 6 years ago

    Blind test suggests drivers overwhelmingly prefer 60mph to 30mph

    • Chrispy_
    • 6 years ago

    120Hz = Good.
    TN = Bad

    Why can’t we get >60Hz in IPS or PLS yet? Even 85Hz or 100Hz makes a dramatic difference.

    What AOC didn’t do was a blind test between an IPS and their nasty cheap TN screens. I’d like to have seen the results of that 😛

    Don’t forget, you also need double the graphics horsepower to run at 120Hz, which means probably spending quadruple since costs don’t scale linearly and that’s why the sweet spot [i<]isn't[/i<] a GTX Titan.

      • brucek2
      • 6 years ago

      Yes, this is the blind test we need. Of course 120hz is better than 60hz. I’d rather the test address the choice actual buyers looking for a new monitor have today — for equivalent budgets, what are the ways to go and which do people end up preferring?

      Or if there’s just no way to make the budgets square because of the extra gpu power, then how much is the difference and how do people perceive the cost/benefit trade off there?

    • kamikaziechameleon
    • 6 years ago

    Why don’t we have a 120 hz IPS or IPS caliber monitor?

      • Pettytheft
      • 6 years ago

      [url<]http://overlordcomputer.com/[/url<]

      You can try to get one of these, but they tend to sell out quickly. I ended up getting one off eBay and waiting until 120Hz became official.

      • DarkUltra
      • 6 years ago

      Not enough demand. Spread the word about IPS and 120hz superiority and help create a market. Thank you.

        • Derfer
        • 6 years ago

        Having had one of these overclocked monitors, I can tell you it doesn’t really work. You can set the speeds, but you trade off color information. There simply isn’t enough bandwidth to do both, and the image quality suffers significantly. No calibration profile can really fix it. If you go straight from TN to the overclock you might not notice it so much, but if you take it down to 60Hz for a while, you won’t be able to stand overclocking it anymore.

      • squeeb
      • 6 years ago

      I’ve been asking myself this for years since I got my 120Hz screen.

    • Bensam123
    • 6 years ago

    Yup…

    I imagine the hate for 120hz will die out with time. This includes people who seem to think there isn’t a difference or isn’t a big difference between 60hz and 120hz. People simply haven’t been exposed to the right content or had a chance to get used to it yet.

    From my experience, I don’t think I could handle a monitor below 120Hz after using my VG248QE. Perhaps this is more of a gamer-oriented thing, but I would (and did) trade color reproduction and pixels for speed, and I’d do it again. It’s simply a superior experience as far as fluidity goes.

    This is also why I’ve been urging TR to shift to 120Hz testing for their graphics benchmarks. Refresh rates will continue to climb just like resolutions will, and most gamers would probably take the Hz over the extra pixels if given a choice. That said, I also believe games, hardware, drivers, and software are still a bit buggy when it comes to 120Hz or anything over 60Hz, which is why it’s important to test it. We’ve been stuck on 60Hz for so long that people no longer test anything outside of it. The WDDM 1.1 dual-monitor bug is one example.

      • Farting Bob
      • 6 years ago

      I haven’t seen any ‘hate’ for 120Hz as a technology anywhere, but a lot of people do not like the current batch of 120Hz monitors because they are all TN. A good-quality IPS 120Hz monitor would be a wonderful thing if I could get one cheap enough, but of the current options, I’d take a cheaper, larger TN over an IPS right now.

        • Airmantharp
        • 6 years ago

        I prefer IPS (or VA/PLS if not used for serious gaming), but if you could get me a high-resolution TN that has decent black levels, decent viewing angles, and colors that can be calibrated, I think I could find that acceptable.

        I currently have 5 monitors hooked up to my gaming system (because I can!), and only one is IPS; the other four are TNs of varying quality, and only the oldest, a Samsung 204T, is better than average. I still have no problems aiming them all at my seating position and using them for data readouts, web pages, system monitors, Ventrilo, or streaming video.

        • Bensam123
        • 6 years ago

        Hate as in they describe it as hogwash or a placebo because ‘the human eye can’t possibly perceive faster than 60 fps’, so there’s no reason for a faster refresh. They then go on to rant about what terrible panels TNs are and how they should be purged from this world, neglecting to mention that TN panels definitely help take advantage of a 120Hz or faster refresh.

      • DarkUltra
      • 6 years ago

      That goes for me too; I’d take a 120Hz TN panel over a 60Hz one any day, both for desktop use and gaming. The mouse cursor, window movement, and Metro animations all look very solid.

      Here are some of the games and apps I’ve encountered that do not work well, or at all, in 120Hz mode:

      [url<]http://jooh.no/index.php/2012/05/12/lifting-the-60fps-limit/[/url<]

      • npore
      • 6 years ago

      +1, would love to see TR looking at 120Hz. Admittedly it’s still niche given the hardware required to keep games at that frame rate. I think CPU bottlenecking starts to really kick in when you are getting to sub 10ms frame times; CPU overclocking could be quite important.

    • evilpaul
    • 6 years ago

    Back when CRTs were still a thing (and now, if I’m at a friend’s place where they’re still using one), I could always immediately tell if it was running at 60Hz.

      • Mikael33
      • 6 years ago

      Most can, you’re not special.

    • south side sammy
    • 6 years ago

    Even in a non-gaming situation I always found 120Hz to be better. But get on forums somewhere and there’s always a douchebag who has to argue, because they don’t know the difference and think “because your eyes can’t see it”… where did they come up with that? It’s about more than what you can see.

    When they have a 1920x1200 one for $300, I’ll invest.

    • odizzido
    • 6 years ago

    I’ve used a 120hz LCD monitor and I quite like it…a lot….actually I used to run my CRT at 100hz so I lost smoothness when going LCD, but LCD is so much easier on my eyes.

    • Star Brood
    • 6 years ago

    There is a $150 120Hz monitor on NewEgg, for what it’s worth:

    [url<]http://www.newegg.com/Product/Product.aspx?Item=N82E16824254111[/url<]

    No reviews 🙁

    In B4 replies: "Yawn, wake me when it's $X/Hz"

      • yogibbear
      • 6 years ago

      Yawn, wake me when it’s 6 goats / orange.

        • odizzido
        • 6 years ago

        Oranges are super cheap… even if the goats just showed up at someone’s door, transporting them would take way too much time and money for six of them to be worth the same as an orange.

        • bthylafh
        • 6 years ago

        Yawn, wake me when it’s 2 girls / cup.

      • A_Pickle
      • 6 years ago

      It’s also a Hanns-G. Never buying from them again.

        • LukeCWM
        • 6 years ago

        Why not? What was your experience?

        I’m not discrediting you, just curious. I’m never buying from NZXT again. =]

        • bthylafh
        • 6 years ago

        I haven’t had any especially /bad/ experiences with them; sample size is only about 3 or 4, though. They’re not amazing by any stretch, but OK for the price.

          • Airmantharp
          • 6 years ago

          My Hanns-G 27.5″ 1200p panel is ugly as sin, but it is bright, and I have been able to calibrate it for semi-useful photo editing. It’s the extremely shallow (as opposed to ‘deep’) blacks and the gobs of ghosting on dark objects that get to me.

          But otherwise it performs exactly as expected for the price, and it is a bright, sharp TN with very large pixels. That counts for something, right?

      • BoilerGamer
      • 6 years ago

      Wake me up when it is IPS, or OLED.

    • hoboGeek
    • 6 years ago

    “Once you go one twenty hz, going back it really hurts”

    • jensend
    • 6 years ago

    Surprise! A contest heavily sponsored by a manufacturer of 120Hz displays resulted in people saying 120Hz was better. Guess why? They disabled vsync even though the minimum frame rate was 140 fps, so of course they got a lot of tearing at 60Hz.

    Always, always, always enable vsync if your setup can maintain a higher frame rate than your monitor’s refresh rate. For a hardware specialty site to not take this obvious advice when setting up such a test is outright dishonest.

      • Prestige Worldwide
      • 6 years ago

      Never, ever, ever use vsync if you care about input lag. It’s the difference between getting a frag or being teabagged.

        • jensend
        • 6 years ago

        OK, “always always always” may have been an exaggeration since some games are poorly coded and might have noticeable increases in input lag from vsync. But on most games the increase in input lag will be slim to nil, especially if your hardware is reliably able to produce frames in time and you disable the flip queue (“maximum pre-rendered frames=0” or the equivalent in your driver).

      • bcronce
      • 6 years ago

      Tearing is an artifact of “not fast enough”. All vsync is good at is making the game choppier by being an artificial limitation.

      But yes, I agree that it would have been a better test with vsync enabled, because tearing is a dead giveaway. On the other hand, tearing is a large reason to use 120Hz in the first place.

        • xeridea
        • 6 years ago

        Tearing is not an artifact of “not fast enough”. It will happen, no matter how many FPS you get. It is not an artificial limitation, it is to prevent tearing. If your framerate is really low, perhaps it should be disabled, so you can get a few more FPS, but have tearing.
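This point is easy to demonstrate with a toy timing model: without vsync, buffer swaps land at arbitrary points in the scanout no matter how high the frame rate, so nearly every refresh contains a tear. This is an idealized sketch (real swap times jitter rather than arriving at exact 1/fps intervals):

```python
import math

def tears_per_second(fps, refresh_hz=60, duration_s=1.0):
    """Count buffer swaps that land mid-scanout (each one shows a tear).

    Idealized model: the game swaps buffers every 1/fps seconds, the
    display scans out every 1/refresh_hz seconds, and any swap that is
    not aligned with a vblank splits that refresh between two frames.
    """
    period = 1.0 / refresh_hz
    tears = 0
    for i in range(1, int(duration_s * fps) + 1):
        phase = math.fmod(i / fps, period)
        # Treat swaps within a microsecond of a vblank as aligned.
        if min(phase, period - phase) > 1e-6:
            tears += 1
    return tears

for fps in (59, 143, 237):
    print(f"{fps} fps on a 60Hz panel: ~{tears_per_second(fps)} visible tears/s")
```

Raising the frame rate changes where the tear lines fall and how stale each slice is, but only syncing the swap to the vblank actually removes them.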

    • JDZZL
    • 6 years ago

    “More blind testing is required.”

    Come on, you’re just going to softball it in like that?

    • brucethemoose
    • 6 years ago

      [quote<] We disabled Vsync.[/quote<]

      There's the important bit. With minimum framerates around 140-150 fps and no vsync, 120Hz should look a whole lot better than 60Hz. A 60Hz refresh + 200 FPS = screen tearing galore.

      I've tested 110Hz, 96Hz, 80Hz, and 60Hz on my 1440p monitor, and while the difference is certainly there, it's not night and day like the switch to 1440p/IPS was.

      • jthh
      • 6 years ago

      What 1440p monitor do you have?

        • brucethemoose
        • 6 years ago

        A Shimian QH-270. It was the only (and first) Korean monitor for sale on eBay back then.

      • Airmantharp
      • 6 years ago

      I hate running with Vsync disabled; I keep it enabled for any game that isn’t reaction/accuracy competitive, which for me is really only BF3 these days.

      But I’d love to run it all the time, even in BF3, and especially with a higher-refresh rate monitor. Actually, I think that Vsync is likely more effective as refresh rates increase, because the ‘stall time’ for rendered frames drops.

      Anyone have any comments on Vsync at 60Hz vs. 120Hz?

        • travbrad
        • 6 years ago

        I’m pretty much the opposite. I’ve never been able to use Vsync in any game without the input lag being noticeable and annoying. It’s worse in some games than others but it’s always a problem for me. Even in Eve-Online or RTS/TBS games trying to use all the menus and UI elements with input lag is horrendous. It seems to bother some people a lot more than others though.

        To actually answer your question, Vsync should cause less input lag at 120Hz than it does at 60Hz. Vsync often creates about an extra frame of input lag, so if your frametimes are halved, you are going to see a lot less lag. If you want to read a long article explaining all the facets of input lag (including Vsync), [url=http://www.anandtech.com/show/2803<]Anandtech has a great article about it[/url<].
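        A rough back-of-the-envelope sketch of travbrad's point (Python; `worst_case_vsync_lag_ms` is a hypothetical helper, and real pipelines add flip-queue depth and other stages on top): with vsync on, a finished frame can wait until the next refresh boundary, so the worst-case added lag is one refresh interval.

```python
# With vsync enabled, a frame that just missed a refresh boundary waits for
# the next one, so vsync can add up to one full refresh interval of lag.

def worst_case_vsync_lag_ms(hz):
    """Worst-case extra latency from waiting on the next refresh, in ms."""
    return 1000.0 / hz

print(round(worst_case_vsync_lag_ms(60), 1))   # 16.7 ms at 60Hz
print(round(worst_case_vsync_lag_ms(120), 1))  # 8.3 ms at 120Hz
```

        Halving the refresh interval halves the worst case, which matches the "about an extra frame of input lag" rule of thumb.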

          • Airmantharp
          • 6 years ago

          Input lag vs. tearing… we’ve been making that trade-off for a long time now :).

          I agree, if it affects the cursor input, I’d probably turn it off too- but even as sensitive as I am to such things, I’m also very good at ignoring and working around them.

          I mean, seriously, I have a Pentium 4 under the desk at work that sees some use. Windows 8 on an SSD is a whole ‘nuther world, in comparison!

          • brucethemoose
          • 6 years ago

          RadeonPro’s frame limiting is a lot better than Vsync (as it also smooths out microstutter), and I think there’s an Nvidia equivalent. You should give that a shot.

            • Airmantharp
            • 6 years ago

            Pretty sure I have that on; except for BF3, but I haven’t checked lately.

            • travbrad
            • 6 years ago

            I tried Nvidia’s version and it gave me just as much input lag as regular V-Sync (at least in the couple games I tried).

      • mdrejhon
      • 6 years ago

      I should note that there’s a more dramatic difference with other [url=http://www.blurbusters.com/faq/120hz-monitors/<]120Hz monitors[/url<] than with the overclocks. The overclocks tend to have slightly more motion blur at 120Hz than the TN 120Hz monitors because of the slightly slower pixel response of the 1440p panels. LightBoost also widens this chasm greatly -- see [url=http://www.blurbusters.com/faq/60vs120vslb<]60Hz vs 120Hz vs LightBoost[/url<].

      Here's an [url=http://hardforum.com/showthread.php?p=1039591537#post1039591537<]example post[/url<] from a person who owns both an IPS monitor and a LightBoost monitor:

      ____________________

      [quote<]So I finally got the VG248QE hooked up last night and was able to play around with it for a couple hours. The other monitor that I have is a HP ZR30W which is a 30" 2560x1600 IPS monitor so I will be comparing the VG248QE to that a lot in this review.

      Right off the bat, I noticed the color quality seems to be a lot worse than the ZR30W. Everything looks to be washed out, dull and not to mention the monitor suffers from poor viewing angles. On the ZR30W, there is next to no color shifting when I move my head around unlike the VG248QE, but that's a common problem with all TN monitors. I tried calibrating the monitor a little bit using some of the values posted online, but it still doesn't compare to the HP.

      Moving on, the first thing I tried was 144 Hz gaming. I loaded up Borderlands 2 just to see how it is and I can definitely say it felt smoother. There is no screen tearing at all on the ASUS, unlike how it is on the HP if I don't turn on Vsync. Although the game felt smoother at 144 Hz and there was less blurring, I found that having to play on a lower res (1920x1080 vs 2560x1600) and poorer color reproduction made the overall gaming experience WORSE. Granted this isn't a competitive, online FPS game so I might have benefited more from having a faster refresh rate, but I would have probably stuck with playing this game on the 30" IPS monitor rather than a 24" TN. [b<]At this point I felt like I may have wasted $300 bucks on a monitor that is full of compromises.[/b<]

      The next thing I tried of course was using the Lightboost hack. This was the main reason why I bought the monitor in the first place since there are plenty of other 120 Hz monitors that I could have gotten that I'm sure had better color reproduction. So I downloaded the hacked INF file and followed Mark's instructions. After turning on Lightboost, I noticed the monitor became a little bit brighter so I loaded up PixPerAn just to verify everything is working. The first thing I noticed was that I can actually read "I need more socks" at full speed! This was cool since I've never been able to read it going so fast before on any LCD monitor. I then proceeded to load up Borderlands 2 again not having much expectations. The first thing that happened was I noticed the FPS drop down to around 1-2 fps, but then I remembered to hold down "Ctrl-T" for a few seconds to turn off the 3D effect which fixed the FPS problem. So I loaded up a game and the first thing that came to my mind was... [b<]SWEET MOTHER OF GOD![/b<]

      [b<]Am I seeing this correctly? The last time I gamed on a CRT monitor was back in 2006 before I got my first LCD and this ASUS monitor is EXACTLY like how I remembered gaming on a CRT monitor. I was absolutely shocked and amazed at how clear everything was when moving around. After seeing Lightboost in action, I would have gladly paid twice the amount for something that can reproduce the feeling I got when playing on a CRT. Now I really can't see myself going back to my 30" 2560x1600 IPS monitor when gaming. Everything looks so much clearer on the ASUS with Lightboost turned on.[/b<]

      If you do any kind of gaming, you should definitely get this monitor. For everything else however, an IPS monitor would probably be better. Thankfully I am lucky enough to have both :)[/quote<]

        • brucethemoose
        • 6 years ago

        Now all we need is a 1440p lightboost monitor for $300 😀

        Hmmm, would it be possible to replace the backlight of an LCD with a LightBoost-compatible one?

          • mdrejhon
          • 6 years ago

          I’ve trailblazed some research in the past; see [url<]http://www.blurbusters.com/category/homebrew/[/url<]

    • LukeCWM
    • 6 years ago

    [quote<]Blind test suggests gamers overwhelmingly prefer 120Hz refresh rates[/quote<] Testing blind? Seems like a silly way to compare visual-only phenomena...

    • hoboGeek
    • 6 years ago

    What better way to determine some statistics about visual qualities than a… “blind test”? Isn’t it ironic?

    • LukeCWM
    • 6 years ago

    [quote<]Those who preferred gaming at the higher refresh rate reportedly described the experience...[/quote<] Purportedly, Geoff is sick of our teasing. =D

      • Star Brood
      • 6 years ago

      From his more recent article:

      “The ECS implementation purportedly allows multiplier-based overclocking”

      Looks like he’s trolling.

        • LukeCWM
        • 6 years ago

        That’s okay. We can troll back. =]

    • Welch
    • 6 years ago

    120 is superior in that your eyes can see between 72-75 FPS; anything more than that really can’t be perceived. 60 just does not cut it for fluidity; the old CRT 60 vs 75Hz comparison is a great demonstration of this. So is the science “experiment” where you tie a string to a motor, run the motor so the string swings at a certain rpm, then fire a strobe light 60 times a second (60Hz) and again at 75Hz, and notice the string looks as though it’s not moving. IT’S SCIENCE (Bill Nye screams from out of nowhere).
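    The strobe demonstration above is ordinary temporal aliasing, and it's easy to sketch (Python; `apparent_angles` is a hypothetical helper): sample the angle of a spinning point at the strobe rate, and it appears frozen whenever every flash catches it at the same angle.

```python
# Temporal aliasing: a point spinning at 60 rev/s looks frozen under a 60Hz
# strobe because every flash catches it at the same angle; a 75Hz strobe
# catches it at a different angle each flash, so the motion is visible again.

def apparent_angles(rev_per_sec, strobe_hz, flashes=5):
    """Angle (degrees) at which each strobe flash catches the spinning point."""
    return [round(360.0 * rev_per_sec * i / strobe_hz % 360.0, 1)
            for i in range(flashes)]

print(apparent_angles(60, 60))  # [0.0, 0.0, 0.0, 0.0, 0.0] -- looks frozen
print(apparent_angles(60, 75))  # [0.0, 288.0, 216.0, 144.0, 72.0]
```

    Note this only shows why the string appears stationary under a matched strobe; it says nothing about an upper limit of human perception.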

    I’ve always been able to visually tell a 60Hz from a 120Hz display. It’s not always that they are doing much more than double-buffering an image and delivering it twice to give the impression of fluidity, something the gimmicky 240Hz TVs were doing to seem superior to the 120Hz sets, when in reality they were still just 120Hz TVs themselves.

    Keep in mind that regardless of refresh rate, if your display is 75Hz or higher, you still need to make sure that your video card can reliably pump out a consistent FPS to match your eyes’ needs. It would be pointless to have a 120Hz monitor if your GPU was only putting out 30-50 fps. It’s also important to reduce the range of FPS the card puts out, say 75-85 instead of 40-80, since your eyes will pick up the difference and interpret it funny. That’s the same reason frame latency is so crucial, as TR has demonstrated to us and even to the GPU-manufacturing world.

    And as a side comment… AOC displays are pure junk. I’d trust an HP display over them.

      • jcw122
      • 6 years ago

      I thought CRTs didn’t have refresh rates?

        • Deanjo
        • 6 years ago

        CRTs are the devices that do have a true refresh rate. It referred to the number of times the phosphors were recharged per second.

          • moose17145
          • 6 years ago

          Indeed they did. When I was running CRTs, I HAD to set my refresh rate above 60Hz; otherwise I could see them flickering, and it would give me a serious migraine in a matter of minutes. Once bumped to about 70 or 75Hz, I couldn’t see it anymore. Once I got an LCD, my eye strain went WAY WAY WAY down. LCDs do not need to refresh themselves the way a CRT does, so the refresh rate doesn’t really matter as much in terms of eye strain. The pixels are either one color or they are not, kind of like a one or a zero. There isn’t any REAL refreshing like there was in a CRT.

          Also I do not understand all the down votes that jcw122 got. It seemed like he was asking a legitimate question because he did not know. But I suppose the 9 people who down voted him are right. How dare he ask a question that could lead to a greater understanding of the world around him simply because you already know the answer and he doesn’t!

      • OhYeah
      • 6 years ago

      I’m curious, where do people get the idea that the human eye/brain cannot detect anything over a certain number of frames per second? The limit in the 70-100 fps range is so ridiculously low that any serious gamer can debunk it.

        • superjawes
        • 6 years ago

        I could see a cost/return argument being made. And obviously, the difference becomes [i<]less[/i<] noticeable as frequency increases. 30 to 60 is huge; 60 to 120 less so.

        Really, the biggest factor in "detectability" (at least in the gaming sense) is how severe the defect is. Long stutters and extreme tearing are going to be noticeable at almost any FPS/refresh rate. The only difference is that higher rates would make these defects [i<]less[/i<] noticeable.

        • odizzido
        • 6 years ago

        Maybe they found someone with really bad eyes and tested them on TV?

        Most people can pick out a single frame of a bird in the sky in a movie running at 500FPS, but I am not sure if I could tell the difference between 500FPS and 400, or even 300FPS. I’ve never tried though.

          • bitcat70
          • 6 years ago

          Do you have a source for that?

        • LukeCWM
        • 6 years ago

        <RabbitTrail>

        Along the same lines, why do so many electronics news reporters think that the human eye can’t see anything past 1080p and therefore 4k is the worst thing that could ever happen to us?

        Even if the eye can’t make out a single black pixel amongst a sea of white pixels x feet away from a y sized screen, edges and curves can still look more natural, and lines can be perceptibly thicker by being only one pixel wider. And diagonal lines can be so much smoother.

        And not everyone sits the same distance from a screen of a given size…

        </RabbitTrail>

          • Airmantharp
          • 6 years ago

          AND color gradients can be smoother because there are more pixels available to show subtle changes.

          The list goes on, really, and expands every time you change content.

        • pragma
        • 6 years ago

        Google is the friend of a curious person. [url<]http://google.com/[/url<] [url<]http://www.newscientist.com/article/dn9633-calculating-the-speed-of-sight[/url<] [url<]http://webvision.med.utah.edu/book/part-viii-gabac-receptors/temporal-resolution/[/url<]

      • superjawes
      • 6 years ago

      [quote<]It would be pointless to have a 120 hz monitor if your GPU was only putting out 30-50 fps.[/quote<]

      Not true, and for the same reason TR started using latency metrics. It all comes down to the reconstruction of the moving image.

      Even if you assume that your 50 FPS is a "true" measure--all frames take the same time and are evenly spaced--you are still going to get tearing: places where your monitor starts displaying a new frame mid-refresh. Even though these are brief, they can still be visible if the tear is severe enough. The benefit of a higher refresh rate is that these tears are significantly reduced AND displayed for a shorter period of time. Even at FPS < 60, you still get a benefit.

      Now you could recommend a GPU upgrade before a snazzier monitor (assuming you are happy with your current one), but regardless of your GPU's performance, you would see an improvement at 120 Hz in games.

        • xeridea
        • 6 years ago

        No matter how high the refresh rate, you will always see tearing. Higher refresh doesn’t reduce tearing. With vsync disabled, the screen will always tear, no matter what.

          • superjawes
          • 6 years ago

          Yes and no. In the strictest sense, yes, it still happens. However, if you’re talking about 30 FPS @ 120Hz, that means the tear will only happen on one out of four [s<]frames[/s<] refresh cycles, as opposed to every other one @ 60Hz.

          At the same time, with higher frequencies (and FPS), the time step in the game simulation is smaller. The worst tears happen when several "runt" frames are squeezed between two "regular" frames on a refresh cycle, so the time jump between the larger, more visible frames is quite large. At a higher refresh rate, those runt frames might get several more pixels' worth of screen time, smoothing out the tear compared to what it would be at lower rates.

          So again, it still happens, but the effect can be better mitigated by refresh rate.
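          The one-in-four arithmetic can be written down as a tiny sketch (Python; `torn_refresh_fraction` is a hypothetical helper assuming evenly spaced frames that are not synced to the refresh):

```python
# Each unsynced buffer flip tears the refresh cycle it lands in, so with
# evenly spaced frames the fraction of torn refreshes is fps/hz (capped at 1).

def torn_refresh_fraction(fps, hz):
    """Fraction of refresh cycles that contain a tear (simplified model)."""
    return min(fps, hz) / hz

print(torn_refresh_fraction(30, 60))   # 0.5  -> every other refresh tears
print(torn_refresh_fraction(30, 120))  # 0.25 -> one refresh in four tears
```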

      • albundy
      • 6 years ago

      Since when did HP make their own displays? Because they don’t. How do you know HP doesn’t use AOC panels?

      • Bensam123
      • 6 years ago

      Human ‘limitations’ like this have not been proven and are just pulled out of thin air. The old version used to be that humans can’t see more than 60fps; some even said 50 at one point. It keeps climbing a notch at a time, along with the eyesight of whoever is making the claim.

      There is TONS of conflicting information on subjects like this online. Truth is, there is no ‘absolute’ as far as how fast the brain can perceive information, process information, or otherwise interpret the world around you. This applies to all of your senses. People are very different, genetically and in how they’re raised. For example, even if you don’t have the genetics for a fast reaction speed, you can still train and hone yourself to have a faster one. This applies to people with naturally fast reaction speeds too.

      Neural plasticity is the term you’re looking for here.

      Putting this aside, it’s still beneficial to have a faster refresh even if the pixel refresh or the GPU can’t deliver 60+ fps or the equivalent response time in ms. Why? Because having the absolute latest information available to your monitor cuts one notch out of the latency chain. It’ll start a refresh as soon as the information is available. FPS, refresh rate, and response time are not all dependent on each other.

      • Welch
      • 6 years ago

      LOL I’d love to see the down vote system require people to say why they disagree with the author of comments. I don’t see how I said anything extremely arguable. At least reply to the post.

        • NeelyCam
        • 6 years ago

        ??? Do you not see all the replies to your post? Were they coming at you at too high a refresh rate…?

          • Welch
          • 6 years ago

          Yes, too fast of a refresh, you got me; I’m truly the blind tester in all of this.

          lol, I didn’t see any of the replies because the mobile version of TR’s comments gets sorted in different orders and the replies don’t even follow one another.

          If the 72-75 figure was debunked, then great. But back when refresh rates and FPS in games were being discussed, a few years after LCDs came out and the controversy started, I found a research paper from a group at a university that was meant to put an end to all of it. I should probably clarify that the 72-75 above referred to what your brain perceives as true motion or fluidity. They accounted for some variance in NORMAL people and their hereditary differences by saying 72-75, and noted that some individuals can see more or less than that, but that most people fall into the 72-75 range.

          Whether you can actually notice things beyond that, and whether there are any benefits past that, I don’t know. I do think relying on “any gamer can tell you” is a crock of shit, as some people have it in their heads that it’s faster, so they swear it is. I’d like to see if a person going from a 75 or 85Hz refresh rate to 120Hz can tell a difference. And not a 120Hz that just buffers an extra frame like the “240Hz” TVs were doing; that’s just trickery.

          They said that 88% of subjects could tell the difference. Great, but who cares? Whether that 88% can BENEFIT from it is what matters too. The article also said that no improvements in kill/death ratios were noticed (which is a very arbitrary metric anyhow).

          I’ll see if I can find the article I read and see if it’s even relevant or if it’s been debunked.

        • Bensam123
        • 6 years ago

        Read my post about neural plasticity which you decided to gloss over before making a reply right below it.

      • Airmantharp
      • 6 years ago

      You talked about people downvoting, so here’s a quick point-by-point:

      The number (that doesn’t really exist) has been pegged around 250FPS for fighter pilots. Hard part is that the testing is more limited by available testing equipment than anything else, as well as by human error, and it depends highly on the test material. But as a quick example, I can tell you positively that anything under 85Hz on a CRT was too low for me; 60Hz was painful, and 75Hz was tolerable, but I needed 85Hz to be actually comfortable, and yes, I could tell you the difference from across the room.

      HDTVs use a ‘sampling’ algorithm to produce frames between the transmitted frames of a signal. Very, very few can actually accept a 120Hz 1080p input; most are limited to 60Hz and typically up-sample to their ‘panel speed’ internally only. I doubt this is anything new to you, but you did leave it a bit ambiguous. Note that even 3D at 1080p really only requires a 48Hz input (or 23.976Hz * 2) for most films, so supporting a maximum of 60Hz is good enough.

      You’re right on target with having a GPU that can put out an acceptable frame rate, and that it needs to do so evenly. If 120Hz is the goal, you should probably be willing to accept lower overall graphics settings.

      As for AOC displays being ‘pure junk’: I have to believe that any company can put out a decent display if they so choose, and I know I’ve seen examples of AOC panels being used to excellent effect. And hating on HP? I bought my HP ZR30w over the more expensive and input-lag-prone Dell U3011 without blinking an eye, and I have a pair of the Dells on my desk at work. The HP is definitely up to snuff.

        • Generic
        • 6 years ago

        “I can tell you positively that anything under 85Hz on a CRT was too low for me; 60Hz was painful, and 75Hz was tolerable, but I needed 85Hz to be actually comfortable, and yes, I could tell you the difference from across the room.”

        This.

        Step one when troubleshooting an acquaintance’s box: Fix the damned refresh rate…

        Step two: Wave hand in front of monitor to illustrate to them what offends me so.

      • Derfer
      • 6 years ago

      I believe the Air Force said something about pilots maxing out at about 400 fps.

        • Bensam123
        • 6 years ago

        Just another random number and all the more reason to believe there is no number for this. 400fps is a nice round figure to lump all pilots under.

        • jensend
        • 6 years ago

        The Air Force deal was recognizing a single picture that flashed on an otherwise dark screen for 1/220 of a second. That bears no relation to frame rate. You might as well claim that our ability to perceive the images on a CRT (which are only bright for ~50 microseconds at a time) proves we’re able to see 20,000fps.

      • Firestarter
      • 6 years ago

      [quote<]anything more than that really can't be perceived[/quote<] unless you provide a source for such claims you can stop your argument right there

    • superjawes
    • 6 years ago

    Getting a good test on refresh rate versus [other factors] would be difficult, since gamers tend to use computers for a lot of reasons. For example, TN viewing angles are fine when all you’re doing is gaming (looking straight on), but if you want to watch a movie, you might not be looking straight on anymore, and viewing angles become a pain.

    But we’ve already discussed this to death, so let’s just all agree that we would love 120 Hz 8-bit IPS monitors with snappy response times and higher resolutions at affordable prices.

    Yeah, yeah, not going to happen any time soon, but can we at least get some monitors with several of those features?

    • RandomGamer342
    • 6 years ago

    Once you go 120hz, you never go back…

      • LukeCWM
      • 6 years ago

      I’ve gathered from reading TechReport that there are a lot of expensive things you can’t go back from. They were right about SSDs. I’m afraid to try 120Hz, IPS, or mechanical keyboards because they’re probably right about those too. =\

        • jessterman21
        • 6 years ago

        I know, right? I had no idea what aliasing was until it was pointed out to me on a tech site. Same for anisotropic filtering. And since I got my SSD last year, I am painfully aware of the significant amount of time I spend waiting on spinning hard drives at work.

        P.S. – I tried dropping the refresh rate to 40Hz on a laptop I was staging the other day and it made me want to cry.

      • hoboGeek
      • 6 years ago

      You realize it doesn’t rhyme? It’s never gonna stick. Try something like:
      “Once you go one twenty hertz, going back it really hurts”

      • Arclight
      • 6 years ago

      120Hz viewed,
      Immediately wooed.

      • Chrispy_
      • 6 years ago

      [b<]I went back, and I won't try 120Hz again until it's better than TN.[/b<]

      I already had a 120Hz Samsung S27A750, which, based on my in-store comparisons, was the least horrible 120Hz TN panel I could find. Decent colour vibrancy (for a TN) and an elegant design. Then I bought a cheap 27" 1440p Korean screen for about $320 or so (an Achieva Shimian) and put them side by side: one for web, CAD, and movies, and one for gaming.

      It took me less than two months before I decided to eBay my Samsung. Side by side, the difference between IPS and TN was so much more important than the difference between 60Hz and 120Hz.

      I'm not a twitch gamer anymore, but I still play fast-paced games all the time. I can run my IPS screen at 85Hz, but I've chosen not to based on the lack of warranty and reports that some people burn out their logic boards running above the rated 75Hz.

        • NeelyCam
        • 6 years ago

        [quote<]reports that some people burn out their logic boards running above the rated 75Hz.[/quote<] That doesn't make sense, unless those logic boards automatically overvolt at higher frequencies

        • Pettytheft
        • 6 years ago

        I went back to an IPS display because TNs are really that bad. Yes, I’m sitting directly in front. The colors were so washed out I couldn’t stand it. I had an old Dell 2001FP and jumped on the 120Hz train, keeping the old Dell as a second monitor, but I could never get my Acer to look proper. Finally got a Korean monitor and it’s worth every penny; I even prefer gaming on it.

        • RandomGamer342
        • 6 years ago

        This is why I’m using the 1440p 120Hz IPS Catleap that was linked in this article.

          • Chrispy_
          • 6 years ago

          Yeah, it’s a shame they didn’t exist when I bought my screens :\

    • JdL
    • 6 years ago

    How the heck do you do a “Blind test” when it comes to monitors? So 120 Hz FEELS better? I believe you, but it’s hard…

    /hehehe

      • Lans
      • 6 years ago

      Well, I am not exactly sure about the questions/questionnaires they used, but it seems like they had a bunch of the same model of monitor, capable of 120Hz but settable to a 60Hz mode, with menu access locked somehow. Each person then had a chance to experience 60Hz and 120Hz, in random order and not told which was which, one would assume. In that way, it would seem like a good/valid blind test.

      Like others, I would have liked a blind test of IPS/better-visual-quality monitors vs. 120Hz/TN/more-responsive monitors. I assume one would have to get the exact same monitor sizes and cover up the bezels or whatnot, though (instead of just simply changing settings when only one model is being used).

    • Prestige Worldwide
    • 6 years ago

    Not surprised. I went 120Hz with a BenQ XL2420T last year and would never go back. Best addition to my gaming rig I’ve ever made.

      • hoboGeek
      • 6 years ago

      I am willing to bet that the best addition to your gaming rig was the video card, I can only guess you are not using the on-board one.

        • Prestige Worldwide
        • 6 years ago

        Already had a video card at the time of purchase. Arguably two: a GTX 295, SLI on a stick. The monitor was an addition to an existing rig.

        I upgraded to a GTX 670 a few months after the purchase, and to SB-E a year later, and in terms of a jaw-dropping increase in gameplay fluidity, going 120Hz is the best upgrade I have ever made.
