BenQ Zowie XL2540 monitor spits out frames at 240Hz

BenQ acquired eSports fanatics Zowie last year, and has been busy stamping the name on its gaming monitors. The latest display bearing the Zowie name is the XL2540, a 24.5" 1920×1080 monitor whose claim to fame is its native 240Hz refresh rate.

This isn't the first 240Hz monitor we've seen; Asus' PG258Q also supports a 240Hz refresh rate. This is the first one we've seen with blinders attached, though. Zowie refers to the rotatable plastic panels as a "Shield" and claims that the peripheral-vision-blocking plates will improve gamers' focus. The XL2540 uses a TN panel, of course—possibly this one from AU Optronics. BenQ offers no word on the panel's viewing angles or color reproduction, so we wager those characteristics aren't priorities for this display. The company does say the brightness can hit values up to 400 cd/m², so folks who hate their retinas can burn them right out.

Curiously, there's no mention of FreeSync on the monitor's product page despite AMD listing it as supported on its own site. According to AMD, the XL2540 supports a variable refresh range of 48Hz to 240Hz via DisplayPort or HDMI. Given the extremely high refresh rate and 1-ms response time, this should be one of the sharpest monitors for CS:GO players and arena FPS fans.
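To put those spec-sheet numbers in perspective, here's a quick back-of-the-envelope sketch (illustrative only, not from BenQ or AMD): at 240Hz, each frame is on screen for only about 4.2ms, so a 1-ms response time leaves the panel settled for most of each refresh.

```python
# Frame period for a given refresh rate: how long each frame stays on screen.
def frame_period_ms(refresh_hz):
    return 1000.0 / refresh_hz

print(f"60Hz frame period:  {frame_period_ms(60):.2f}ms")   # ~16.67ms
print(f"144Hz frame period: {frame_period_ms(144):.2f}ms")  # ~6.94ms
print(f"240Hz frame period: {frame_period_ms(240):.2f}ms")  # ~4.17ms

# A 1ms response time as a fraction of the 240Hz frame period:
print(f"Transition takes ~{1.0 / frame_period_ms(240):.0%} of each 240Hz frame")
```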

As a convenience feature, the XL2540 includes a puck-like "S Switch" to allow for quick switching between display presets. Users can hook up the XL2540 using a dual-link DVI input, a DisplayPort, and two HDMI ports. Additional connectivity options include audio ports and a three-port USB hub.

Even though BenQ is only now announcing the XL2540, B&H Photo Video has had it available to pre-order for $499 for almost a month. The site says it expects the display to be available on November 15.

Comments closed
    • Bensam123
    • 3 years ago

    More Hz! It’d be nice if we could get under 1ms as well. I don’t think that’s where everyone needs to stop just because going lower means a decimal point.

    It’s also interesting: with the focus on ultra-low latency, image clarity (non-blurred images; G-Sync and FreeSync are examples), and getting frames out as fast as possible, I wonder if we’ll eventually start seeing a top-down focus (from AMD/Nvidia) on specialized hardware aimed at high frame rates, scene purity, and low latency instead of just throughput.

    I know that sounds like those are synonymous, but they’re not. Depending on what you set your settings to in game, the workload on a graphics card changes drastically. Absolute lowest settings in Overwatch, for instance, completely change the workload on a GPU, to the point where an extremely fast (clock-wise) and lean GPU will perform much better than its full-fat counterparts.

    As far as a venture goes, they wouldn’t need to produce a brand-new chip right out of the gate. Simply cutting down some of their bigger chips and binning them differently (for super-fast clocks) would be achievable. If sales dictate it, they could change their approach as well.

    It’s been something I’ve been thinking about for quite some time. Input latency is also something that can only be tackled on the developer end. You’d be surprised how fast you run into a CPU bottleneck, especially in Overwatch, when trying to eclipse 140fps. I changed from an R9 290 to a GTX 1070, both with a 4690K, and there is almost no difference in FPS at close to the lowest settings (roughly 140-260fps). Obviously, buying the 1070 wasn’t just about playing Overwatch; it was simply an option I had available to me. Next will be changing to a better processor, as I’ve already mentioned (such as a hex-core), which should make the biggest difference.

      • tipoo
      • 3 years ago

      It’s a little funny how old CRTs had 0.001ms response times; everything since the switch to flat panels has just been trying to get back to where we already were. And we’re still off by orders of magnitude.

      There are other benefits to LCDs of course, but it’s interesting when very old tech still beats very new in any area.

    • BigDDesign
    • 3 years ago

    I have the BenQ XL2430T, a 1ms 144Hz monitor, and I play CS:GO pretty much daily, getting just under 300fps with my setup. To say the difference after changing to this monitor was huge for a twitch FPS like CS:GO is an understatement. For 144Hz to really work well, I’m surmising (no proof) that 200fps is a good target for amazing gameplay in CS:GO. Many of us who play the game and have 144Hz monitors have discussed it.

    So Zowie has another monitor out, available now with the blinders, in 27″ with new technology that gets less blur than even mine. It’s a 2560x1440 27″ 144Hz 1ms panel at $699 with “Dynamic Accuracy.” On its website, Zowie shows how much less blur there is than before with this new “Dynamic Accuracy” technology. Not sure about the new 240Hz tech; personally, I would have to see testing. And how much more video card will you need? At 1080p, I still think you would need a GTX 1070 to run CS:GO at 240Hz and over 200fps. I know it sounds silly that CS:GO has to run at such high frame rates, but it really is essential for any chance against really good players. And there are some amazing ones out there; every day I’m amazed at some of the talent this game brings to the table. Also remember that Zowie/BenQ monitors are the choice of eSports gaming events.

    • ronch
    • 3 years ago

    Cool. Now I want my next display to have Mickey Mouse ears.

    • Chrispy_
    • 3 years ago

    Zak, I really like your cynical writing style.

    [quote<]BenQ offers no word on the panel's viewing angles or color reproduction, so we wager those characteristics aren't priorities for this display. The company does say the brightness can hit values up to 400 cd/m², so folks who hate their retinas can burn them right out.[/quote<] Grade-A, [i<]premium[/i<] snark. Bravo sir, bravo...

      • pranav0091
      • 3 years ago

      You could say Zak is….
      <puts on sunglasses>…
      killing it.

        • UberGerbil
        • 3 years ago

        [url<]https://youtu.be/6YMPAH67f4o[/url<]

      • RAGEPRO
      • 3 years ago

      Well thanks, partner.

      • DrCR
      • 3 years ago

      Thanks for taking a moment to comment on this and inducing me to do so as well. Such commentary is something I sometimes take for granted here, but it’s what keeps me coming back vs. the sites that just rehash press-release info.

      Bravo, indeed, Zak.

      • bjm
      • 3 years ago

      Hah! I was going to comment the same thing. I’m liking his style.

    • Mad_Dane
    • 3 years ago

    It just needs to grow to 40″ and 4K resolution, with an IPS or OLED panel and at least 1.07 billion colors.

    But hey, I’m not a pro gamer, so I’m not the target for this product, I know.

    • Anovoca
    • 3 years ago

    In the 2.0 version, I hear they’ll just package the monitor with blacked-out goggles that have a small rectangle of clear plastic in the middle of each lens.

    • Firestarter
    • 3 years ago

    No mention of any blur-reduction modes. I guess it would have been too much to ask for a usable blur-reduction mode at 240Hz (not to mention the game should actually run at that rate for it to work best), but it would have been nice to have at something more reasonable like 120Hz. CS:GO at 240 FPS and 240Hz with blur reduction would be a sight to behold, though.

      • RAGEPRO
      • 3 years ago

      I agree. I actually had something about that in this piece but it got cut in editing. 😛

      BenQ really should have included its blur-reduction feature on this. 4ms ain’t bad, but it ain’t the 1ms that LightBoost can provide.

        • drfish
        • 3 years ago

        [quote<]I actually had something about that in this piece but it got cut in editing. :P[/quote<] Oh snap!

        • Firestarter
        • 3 years ago

        it’s about how our eyes see the image, not about pixel response time. You could have a maximum of 16ms pixel response time on a 60Hz display with backlight strobing and you’d get sharper motion tracking than with a 240Hz screen with 1ms pixel response time and a regular backlight. As long as all the pixels have the right color/brightness when the backlight strobes, it doesn’t matter how long the transition took (except for input lag). But if the backlight is on continuously, our perception of motion on screen will be blurred regardless of how good the screen is, unless the image it displays also has continuous motion (which it does not)

        Granted, if the game runs at 240fps and the screen updates at 240Hz, we’re getting a lot closer to that continuous motion, which reduces the blurring that happens when our eyes track motion onscreen, but even at 240Hz it would still be beneficial to have a strobing backlight. What I really don’t know is how big the difference actually is at 240Hz. I do know that the difference is quite stark at 120Hz.
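The sample-and-hold model described above can be sketched numerically. This is an illustrative simplification (the speed and refresh values are made up for the example): perceived blur during eye-tracked motion scales with how long each frame stays lit, which is the full frame period on a sample-and-hold display but only the strobe length on a strobed backlight.

```python
# Sample-and-hold motion-blur model: blur trail length is roughly
# eye-tracking speed multiplied by frame persistence.

def persistence_ms(refresh_hz, strobe_ms=None):
    """Time each frame is visible: the full frame period for sample-and-hold,
    or just the strobe length when the backlight flashes once per frame."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms if strobe_ms is None else min(strobe_ms, frame_ms)

def blur_px(refresh_hz, speed_px_per_s, strobe_ms=None):
    """Approximate blur trail in pixels for eye-tracked on-screen motion."""
    return speed_px_per_s * persistence_ms(refresh_hz, strobe_ms) / 1000.0

speed = 960  # px/s: an object crossing half a 1080p screen per second
for hz, strobe in [(60, None), (120, None), (240, None), (120, 1.0)]:
    label = f"{hz}Hz" + (f" + {strobe}ms strobe" if strobe else " sample-and-hold")
    print(f"{label}: ~{blur_px(hz, speed, strobe):.1f}px of blur")
```

Under this model, a strobed 120Hz display (1ms persistence) out-blurs a sample-and-hold 240Hz one (~4.2ms persistence), which is the point being made in the thread.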

          • RAGEPRO
          • 3 years ago

          Heh, I dunno why you thought I was talking about response time. I’m talking about image persistence, boss. [url=http://www.blurbusters.com/faq/60vs120vslb/<]Check this link[/url<] for context on what I meant.

            • Firestarter
            • 3 years ago

            that’s exactly what I’m talking about, and it’s not something that is measured in milliseconds. Or rather it is (strobe length), but if that’s what you were talking about, then I don’t understand where you get that 4ms from, because no self-respecting strobing display uses a strobe length of 4ms: that’s a 58% duty cycle when strobing at 144Hz, which is really rather silly

            Edit: to reiterate, a 1 millisecond strobe length at 120Hz would result in LESS motion blur than a 1 millisecond strobe length at 240Hz, because the resulting duty cycle would be lower (and the monitor would coincidentally most likely be dimmer)
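The duty-cycle arithmetic in this exchange is easy to check. A hypothetical sketch (these are the thread's own example numbers, not specs of any shipping monitor): duty cycle is simply strobe length divided by frame period.

```python
# Backlight duty cycle = fraction of each frame period the strobe is lit.
def duty_cycle(strobe_ms, refresh_hz):
    frame_period_ms = 1000.0 / refresh_hz
    return strobe_ms / frame_period_ms

print(f"4ms strobe @ 144Hz: {duty_cycle(4, 144):.0%}")  # the "silly" ~58% case
print(f"1ms strobe @ 120Hz: {duty_cycle(1, 120):.0%}")
print(f"1ms strobe @ 240Hz: {duty_cycle(1, 240):.0%}")  # same strobe, higher duty cycle
```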

            • RAGEPRO
            • 3 years ago

            [url<]http://www.blurbusters.com/wp-content/uploads/2013/06/motion-blur-graph.png[/url<] Preaching to the choir, chief.

            • Firestarter
            • 3 years ago

            Ah, yes, then I understand where that 4ms comes from, as that is the approximate sample-and-hold motion blur at 240fps. I realize my previous statement about a 1ms strobe length being blurrier at 240Hz than at 120Hz is false; the amount of sample-and-hold blur is actually the same, but at 240Hz the screen would actually be brighter (higher duty cycle) and motion tracking should be easier with twice as many samples

            Let’s hope they have something in the pipeline; I’m excited to see how well a 240Hz screen with strobing at 240Hz would work

      • DoomGuy64
      • 3 years ago

      Abusing the minimum adaptive refresh rate with CRU acts as a pseudo blur-reduction mode via LFC. If you set the minimum to, say, 60, then 50fps becomes 100Hz, and the doubled refresh rate diminishes the noticeable blur. Speaking from experience with my MG279.

      While some monitors do support blur reduction as an additional option, that mode still doesn’t support adaptive sync, and there are other issues with it.

      I think it will take a bit longer for monitors to come up with something that both reduces blur and supports adaptive sync. It might very well require a 240Hz screen that inserts a strobe every other frame, which would effectively make it a 120Hz adaptive screen when using blur reduction.
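The LFC frame-doubling behavior described in that comment can be modeled in a few lines. This is an illustrative sketch of the general technique, not AMD driver code; the function name and range values are made up for the example:

```python
# Low Framerate Compensation (LFC): when the game's frame rate drops below
# the panel's variable-refresh floor, each frame is repeated enough times
# to bring the effective refresh rate back into the supported window.

def lfc_refresh_hz(fps, vrr_min, vrr_max):
    """Effective panel refresh rate for a given game frame rate."""
    if fps >= vrr_min:
        return min(fps, vrr_max)           # in range: refresh simply tracks fps
    multiple = 2
    while fps * multiple < vrr_min:        # find the smallest frame multiple
        multiple += 1                      # that lands inside the VRR window
    return min(fps * multiple, vrr_max)    # each frame scanned out `multiple` times

print(lfc_refresh_hz(50, 60, 240))   # 100: each frame shown twice (the CRU trick)
print(lfc_refresh_hz(20, 60, 240))   # 60: each frame shown three times
print(lfc_refresh_hz(144, 60, 240))  # 144: in range, no repetition needed
```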

    • Neutronbeam
    • 3 years ago

    I think the design is quite refreshing.

      • EndlessWaves
      • 3 years ago

      It’s big, grey and has two flappy bits either side of its head…

      [url<]https://youtu.be/k_JnCWT-_O8?t=159[/url<]

        • Neutronbeam
        • 3 years ago

        Funny–I went to college with Michael Stipe at UGA and Peter Buck is a fraternity brother (Emory chapter).

      • morphine
      • 3 years ago

      Naa-a-eigh!

    • ronch
    • 3 years ago

    [quote<]Zowie refers to the rotatable plastic panels as a "Shield" and claims that the peripheral-vision-blocking plates will improve gamers' focus.[/quote<] I knew those newfangled thin bezels were bad for us. We need a clear distinction between reality and computer graphics, you see.

      • geniekid
      • 3 years ago

      Why spend valuable R&D on making bezels thinner when you can convince people they want thicker bezels? Pure marketing genius.

    • chuckula
    • 3 years ago

    [quote<]This is the first one we've seen with blinders attached, though. Zowie refers to the rotatable plastic panels as a "Shield" and claims that the peripheral-vision-blocking plates will improve gamers' focus.[/quote<] Indeed. They block out the harsh harsh glow of the Day Star and the ugliness of RR (real reality).

      • DPete27
      • 3 years ago

      For competitive gaming in a large room or arena with flashing lights and lots of activity, I think the blinders are actually a useful addition. For the average consumer playing at home... probably not.

        • travbrad
        • 3 years ago

        Maybe they are meant to counter the bajillion LED lights that get put on everything now.

      • Chrispy_
      • 3 years ago

      In the product shots above everything is turned off, but I suspect each peripheral-vision-blocking plate is actually inset with [i<]over nine thousand[/i<] RGB LEDs. It's impossible to block out all light from your peripheral vision, so the next best thing is broad-spectrum retinal searing until you are peripherally blind.
