AU Optronics panel to combine IPS tech, 144Hz refresh

The first 144Hz G-Sync monitors are here, and they're amazing. Unfortunately, they're also based on TN panels. TN tech may have gotten a heck of a lot better in recent years, but it's still not quite up to par with IPS.

Happily, we might soon get 144Hz G-Sync goodness and IPS image quality in the same monitor. TFTCentral reports that AU Optronics is developing a 144Hz panel based on AHVA technology, which the site describes as "IPS-mode" and "very comparable" to other IPS types—even in the viewing angles department, where TN usually falls short.

The panel has a 27" diagonal, a 2560×1440 resolution, and an sRGB color gamut. Horizontal and vertical viewing angles are 178°, just like on any self-respecting IPS panel. And the rated contrast and brightness are pretty darned good, too, at 1000:1 and 350 cd/m².

According to TFTCentral, this messiah of panels will enter production in September. The site doesn't say when the promised one will actually grace us with its presence, but if previous holy births are any indication, perhaps we'll see something around Christmas. I'll keep my eye out for a new star in the night sky, in any case. (Thanks to TR reader SH SOTN for the heads up on this one.)

Comments closed
    • Meadows
    • 8 years ago

    First of all, display bandwidth is limited, unfortunately. Secondly, as superjawes pointed out, 144 Hz is the next “film-compatible” incremental improvement they could manage to cram in without overdriving the usual connectors, and might be the last until new connector revisions or types appear.
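    Meadows' bandwidth point is easy to put numbers on. A back-of-envelope sketch (assuming 24-bit color and ignoring blanking intervals, which add further overhead in real display timings):

    ```python
    # Uncompressed video bitrate vs. the DisplayPort 1.2 payload limit.
    # Assumes 24-bit color and ignores blanking intervals, so real
    # display timings need somewhat more than these figures.

    def raw_gbps(width, height, hz, bpp=24):
        """Raw pixel bitrate in Gbit/s."""
        return width * height * hz * bpp / 1e9

    DP12_PAYLOAD = 21.6 * 8 / 10  # HBR2: 21.6 Gbit/s line rate, 8b/10b coding

    for hz in (60, 120, 144, 165):
        print(f"2560x1440 @ {hz:3d} Hz: {raw_gbps(2560, 1440, hz):5.2f} Gbit/s "
              f"(DP 1.2 payload ~{DP12_PAYLOAD:.2f} Gbit/s)")
    ```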

    • cobalt
    • 8 years ago

    Good point; I hadn't thought about multiple strobing re-introducing sample-and-hold effects. I guess what you wind up with in that case are PWM-like artifacts. It's less smeary but more stuttery, and possibly worse. Ugh, there went my one idea. (Blur Busters shows this artifact, by the way: http://www.blurbusters.com/faq/lcd-motion-artifacts/#pwm)

    Seems like the only possible solution is to introduce one frame of lag so that you can adjust brightness for frame x based on your already-known frame time before frame x+1 appears.
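    That one-frame-of-lag idea can be sketched in a few lines. A hypothetical toy model, assuming perceived brightness simply tracks strobe intensity scaled by how long the frame will be held; none of these names correspond to any real monitor API:

    ```python
    # Toy model of the one-frame-delay idea: because output is delayed by
    # one frame, each frame's on-screen duration is known before it is
    # shown, so the strobe intensity for frame x can be chosen exactly.
    # Assumes perceived brightness ~ intensity * (period / reference).

    REFERENCE_PERIOD = 1 / 60  # frame time treated as baseline brightness

    def strobe_intensities(frame_periods):
        """Relative strobe intensity per frame, given known display periods."""
        return [period / REFERENCE_PERIOD for period in frame_periods]

    # Longer holds get proportionally brighter strobes:
    print([round(i, 2) for i in strobe_intensities([1/60, 1/45, 1/30])])
    # -> [1.0, 1.33, 2.0]
    ```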

    • GrimDanfango
    • 8 years ago

    That’s true. Guess it does come down to a fight between strobing or infinity-fps then 🙂 Maybe we can find a sweet-spot somewhere in the middle.

    • GrimDanfango
    • 8 years ago

    Actually, the other major problem with strobes is that you need at *least* 60 per second, and ideally 80+ to not give yourself a headache/seizure… the CRT days taught us that.

    That's close to being incompatible with G-Sync, as G-Sync is most effective at sub-60fps speeds.

    You could pulse twice per frame, but that would defeat the whole point of reducing sample-and-hold by showing you the same frame twice… you'd re-introduce the exact same stair-stepping effect.

    It could very well work if you’re able to keep a game running at a minimum of 75fps… which happens to be the exact same issue facing the Rift. If you don’t hit that minimum framerate, the effect starts to break down.
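    A quick sketch of that constraint, taking the comment's ~75Hz comfort floor as a given rather than a measured threshold:

    ```python
    # Single-strobe-per-frame only works above a flicker-comfort floor;
    # below it, you need multiple strobes per frame, which re-introduces
    # the sample-and-hold stair-stepping. The 75 Hz floor is the comment's
    # figure, not a measured threshold.
    import math

    FLICKER_FLOOR_HZ = 75

    for fps in (30, 45, 60, 75, 90, 144):
        strobes = max(1, math.ceil(FLICKER_FLOOR_HZ / fps))
        note = "fine" if strobes == 1 else "re-introduces judder"
        print(f"{fps:3d} fps -> {strobes} strobe(s) per frame ({note})")
    ```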

    • rahulahl
    • 8 years ago

    Yes, in most cases it is. However, it's less of an issue on bigger monitors.
    Although I would say that pixels have nothing to do with it; it's the physical size of the monitor.

    • cobalt
    • 8 years ago

    "The monitor is waiting for a frame, when it arrives the intensity or duration of the strobe is simply a factor of how long it has been waiting."

    Right, and I think that's an interesting idea, but I'm not sure it would work perceptually. You'd get the right brightness over a long time (several frames), but I suspect it would not work visually. I suspect for a slow frame time, you have to make the *preceding* frame brighter to keep your photoreceptors firing during the longer dark time following it. I worry making the *succeeding* frame brighter would be too late to prevent high-frequency perceptual brightness variations, probably flickering. (I'm not sure I'm right, of course; that's just my speculation.)

    • Chrispy_
    • 8 years ago

    Yeah, sample-and-hold blur is only going to be defeated by short strobes and long dark periods.

    I still think it would be trivial to implement strobing with G-Sync/FreeSync, though: the monitor is waiting for a frame, and when it arrives, the intensity or duration of the strobe is simply a factor of how long it has been waiting. If it waits longer for a frame, it can either increase brightness, or, if it's already at max brightness and needs to produce more brightness over time, just increase the amount of time it spends on the pulse at max brightness.

    I suspect the next generation of FreeSync/G-Sync will also include strobing. My guess is the reason they didn't do it yet is that there's a limited amount of time these G-Sync screens will be on the market before they're superseded by dual FreeSync/G-Sync varieties. They want to capitalise on the early-adopter tax as early and as hard as they can.

    • cobalt
    • 8 years ago

    But if your eyes are tracking a moving object, which we can do very accurately, you expect NO motion blur. At all. The sample and hold phenomenon with the pixel quantization of space introduces motion blur where there should be none. Literally, because of sample and hold, the image is sliding slightly back and forth across your photoreceptors instead of staying in one spot on the same set of photoreceptors.

    That’s what the shorter strobe is solving; it prevents the false smearing across your retina of an object moving in quantized space at the same rate your eyes are tracking in continuous space.
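    That smear is easy to quantify: while the eye tracks smoothly, a frame held for t seconds slides speed × t pixels across the retina. A small sketch, with the pan speed chosen arbitrarily:

    ```python
    # Sample-and-hold smear: an eye tracking at `speed` px/s sees a frame
    # held for `hold` seconds slide speed * hold pixels across the retina.

    def blur_px(speed_px_s, hold_s):
        return speed_px_s * hold_s

    SPEED = 960  # px/s; an arbitrary example pan speed

    for label, hold in [("60 Hz, full persistence ", 1 / 60),
                        ("144 Hz, full persistence", 1 / 144),
                        ("2 ms strobe             ", 0.002)]:
        print(f"{label}: ~{blur_px(SPEED, hold):4.1f} px of smear")
    ```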

    • JosiahBradley
    • 8 years ago

    But they do; there are forum threads dedicated to using TVs with 120/240Hz inputs.

    http://www.overclock.net/t/1401149/true-120hz-from-pc-to-tv-list-of-tvs-successful-overclocking-of-hdtv-and-plasmas
    http://www.blurbusters.com/overclock/120hz-pc-to-tv/

    Also, any 4K TV will accept a minimum of 120Hz, if not 240Hz, at 1080p.

    • GrimDanfango
    • 8 years ago

    I would have thought distraction was an even quicker way to readjust. It didn't require concentration to end up preferring 60fps; it was just a natural function of watching it, like my eyes adjusting to the sun after being indoors. I get the feeling most people just decide to hate HFR the moment they see it, and all they remember is that initial reaction.
    After watching the whole of The Hobbit, I’d like to have seen people’s reactions to watching the same movie in 24fps straight afterwards, or even switching half-way through. They’d probably have thought the projector was broken 🙂

    I don’t know what about filmmaking could be tailored to a low framerate that would specifically look bad at a higher one… the most they could do is limit the speed of onscreen action, or limit the speed a camera pans, to avoid turning a 24fps movie into an indiscernible blurry mess. Slow camera pans would still look at least as nice at 48fps.

    The one potential “problem” with HFR is that a lot of CGI tends to specifically rely on crappy frame rates and motion blur to “hide” a lot of cheaping out in the production process… which I think was the primary issue with The Hobbit. The effects were definitely not up to the quality of LOTR… they spread themselves way too thin trying to pack every minute of the movie with unending CGI.

    • BobbinThreadbare
    • 8 years ago

    How does having a preferred aspect ratio become pointless? 16:10 is a more pleasing ratio.

    • Derfer
    • 8 years ago

    16:9 complaints become a bit pointless above 1080p. The big complaint then was that 1200p was phased out and replaced with a res with less vertical space. Now all these higher res 16:9 options provide plenty of vertical pixels.

    • superjawes
    • 8 years ago

    Well, a movie might be a little different. In addition to the human conditioning, the industry is used to shooting at 24 FPS, and I have a feeling that certain techniques for camera movements are tuned to that, which could result in "weirdness" at the faster speed.

    Also keep in mind that your video gave you time to think. A film is going to distract viewers from the frame rate with story elements, so viewers might not have the same sort of time to focus on the frame rate. A theater is also only going to show one speed on the screen, so there is no way for a viewer to put the two products side-by-side. They just know that one looks different from what they’re used to.

    • GrimDanfango
    • 8 years ago

    That’s a fantastic measure of physical reaction time, and has absolutely nothing to do with the acuity of your vision.

    The very fact that anyone claims any frame rate as being the maximum humans can perceive instantly marks them out as talking nonsense. For one thing, the responsiveness of the human optical system is massively influenced by light level… if you show someone a very dark moving image, they might not tell the difference between 15fps and anything higher, but show them a brightly lit, well-exposed moving image, and some people will easily tell the difference between 120 and 200fps.

    As with most things biological, there’s no magic number for “science” to discover. There’s just diminishing returns.
    144Hz is a reasonable cutoff at the moment, as it requires very specific circumstances to detect a noticeable difference in anything above it. That's certainly not to say that we *can't* perceive any difference.

    • GrimDanfango
    • 8 years ago

    I don't understand so many people complaining about higher frame rates. I compared a 24fps video file against a 60fps video shot by the exact same camera, panning slowly across the exact same scene…
    For the first minute or so, the 60fps one looked oddly sped-up, and the 24fps one looked "natural"… but after about 5 minutes of comparing them, the 60fps one just looked perfect and lifelike and amazingly crisp, while switching back to the 24fps one was horrible; suddenly it looked like a hideously stuttering slideshow.
    It took me roughly 5 minutes to un-condition my brain. Does it really throw so many people off for the length of an entire movie?

    • GrimDanfango
    • 8 years ago

    It’s interesting reading, but I still think there’s a slight misconception at work here. The problem isn’t that there *is* motion blur… it’s that typically in most computer games, there is *no* motion blur integrated into the rendering itself, but the screen displays a frame for a full frame-length of time anyway. Our eyes expect to either see an object move in that time, or at least leave a blur, rather than just showing as a perfectly sharp image for the entire length of the frame time.

    Low persistence displays mean that our brains can fill in the literally-missing information between the brief blinks of frame that we do see, rather than being given the *wrong* information, which our brains just find slightly confusing. What we actually need in an ideal situation is physically accurate motion blur for the duration of the frame, rather than the current situation, which actually has no motion blur at all. (Well, a truly "ideal" situation would just be infinity-frames-per-second 🙂)

    • superjawes
    • 8 years ago

    *The Hobbit* was released at 48 FPS (at least the first one was). The higher frame rate put some people off because we've been conditioned to 24 FPS over a century of film, but I suspect that most people would prefer higher rates given some conditioning.

    • rahulahl
    • 8 years ago

    I wouldn’t mind it.
    Or at least I wouldn't have, if I had not already bought a ROG Swift. The TN is annoying, no matter how good it is. Especially since I have been using a high-quality IPS after moving on from my CRT, and this was my first TN monitor.

    • rahulahl
    • 8 years ago

    If 30 FPS for movies were the maximum humans could perceive, then people wouldn't be bothering to make movies at higher FPS. From what I heard, The Hobbit is gonna be at 48 FPS.

    In fact, just try a camera with a higher frame rate, and you will see that the picture is so much more fluid.

    • rahulahl
    • 8 years ago

    Even assuming the 200ms figure (which I don't believe) is true, that human lag is on top of the input lag. That 200 is not gonna change regardless of what monitor you use. But if you use a faster monitor, you certainly will react faster to information you see earlier. With a slower monitor that takes time to display the images, you will have a delayed reaction, because you are simply getting the information later.

    And I can assure you that playing a game at 30 FPS, you will feel the control scheme is a lot looser than if you were to play at 60 FPS. Usually, the higher you go, the better your controls will feel. And the better your controls feel, the better you will play most games, which in turn means you get more enjoyment.

    Just try and play a game with FPS reduced to 10, 30, 60 and see the difference. And I can assure you that the difference just doesn't magically stop at 60 FPS. It will keep on improving, though with diminishing returns.

    Try a First or Third person game to see the difference.

    • superjawes
    • 8 years ago

    You're talking about *minimums* for animations to appear fluid. That doesn't mean that humans *can't* perceive higher frame rates (we can).

    • Tristan
    • 8 years ago

    Reaction time:
    http://www.humanbenchmark.com/tests/reactiontime

    Humans need only 30 FPS for fluid animation, and movies run at that speed. But computer games generate stroboscopic images without motion blur, so there is less information for the brain at 30 FPS. This is compensated for with 2x (or more) FPS, when images blend over time on the retina and the 'motion blur information' is partially restored.

    • cobalt
    • 8 years ago

    "The benefit of strobing on a static non-tracking display is in hiding the unsightly pixel transitions, not in reducing motion blur."

    It really is useful for both. Even on a normal monitor, as you scroll text on a page or pan the camera around, your eye is following some sort of motion on the screen. When that happens, the sample-and-hold effect means every image is blurred by one pixel distance on each side. This happens even if the pixel transition time is 0ms; there's no way around it except to leave the display on as little as possible. The BlurBusters explanation is pretty useful, I think:
    http://www.blurbusters.com/faq/oled-motion-blur/

    • GrimDanfango
    • 8 years ago

    Where do you get “Human lag is 200ms” from? What does that even mean?

    Is this wisdom from the same school of “science” that decided 60fps was the most that human vision could perceive?

    • superjawes
    • 8 years ago

    They upped the resolution “a bit”? 2560×1440 offers 77.8% more pixels than 1920×1080. That’s almost double the resolution. On top of that, 1080p is common, 1440p is not, so 1080p panels are going to be cheaper due to higher volumes.
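    For reference, the arithmetic:

    ```python
    # The pixel counts:
    qhd = 2560 * 1440  # 3,686,400 pixels
    fhd = 1920 * 1080  # 2,073,600 pixels
    print(f"{qhd / fhd:.3f}x the pixels, i.e. {(qhd / fhd - 1) * 100:.1f}% more")
    # -> 1.778x the pixels, i.e. 77.8% more
    ```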

    And $200 for an "upgrade kit" is perfectly reasonable considering that said kit is a custom FPGA implementation. The FPGA chip alone probably costs Nvidia $100 (at least) to buy, and that's before you add it to the circuit board.

    Now is the profit margin higher on the 27″ G-Sync ROG versus a $200 1080p display? Probably, but people are still buying it. It’s not exactly gouging if people are willing to pay it. I’d like it to be cheaper, sure, but there are much stronger forces than one voice (or even several).

    • l33t-g4m3r
    • 8 years ago

    Current prices are super gouged. The original 120Hz overclockable IPS monitors started out around $300, then inflated to around $500 after initial demand, then quickly rose to $800-ish. Where's the justification for the price increases?

    As for the TN panels, the original G-Sync monitor was a $200 1080p TN panel modified with a $200 G-Sync upgrade kit. We up the resolution a bit, and the price magically becomes $800. Where's the other $400 coming from? Total rip job, not to mention $200 for an upgrade kit is pretty questionable by itself.

    Yeah, I'd like one of these screens, but I'm not gonna bite until prices start being sane and reasonable. Especially when my current IPS monitor overclocks past 100Hz, and I didn't pay anywhere near these ridiculous prices.

    • Tristan
    • 8 years ago

    Human lag is 200ms. Thanks to the variable refresh of FreeSync (better frame adjustment in time), lag will be lower than with today's fixed 60Hz; probably at the same level as 120/144Hz, or even lower.

    • GrimDanfango
    • 8 years ago

    I think the persistence thing is only an issue if you're using some form of motion sensing to track a moving image to the viewer's head movement… i.e., it's only an issue on the Rift, or presumably with TrackIR and the like.
    For a regular monitor, when you move your eyes/head, the image stays where it is, so you aren't subconsciously expecting an unblurred movement across the screen.

    The benefit of strobing on a static non-tracking display is in hiding the unsightly pixel transitions, not in reducing motion blur.

    To call it motion blur at all isn't quite right in the case of display devices. Motion blur is blurring caused by the motion of the subject being recorded by a video camera, and you *should* have a certain amount of it, relative to the length of a frame, otherwise video will look unnaturally strobey. There's a reason why filmmakers don't shoot video with a 1/4000 shutter speed 🙂
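    The shutter-speed point in numbers: the fraction of each frame interval the shutter stays open sets how much natural blur gets recorded. A small sketch using the conventional 180-degree film shutter (1/48s at 24fps) as the baseline:

    ```python
    # Exposure as a fraction of the frame interval. A 180-degree shutter at
    # 24 fps (the usual film convention) exposes half of each frame, which
    # records the "natural" amount of blur; 1/4000 s records almost none.

    def exposure_fraction(fps, shutter_s):
        return shutter_s * fps

    print(f"24 fps, 1/48 s shutter:   {exposure_fraction(24, 1 / 48):.2f} of the frame exposed")
    print(f"24 fps, 1/4000 s shutter: {exposure_fraction(24, 1 / 4000):.3f} of the frame exposed")
    ```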

    • GrimDanfango
    • 8 years ago

    Indeed, TVs are barely even geared towards 60Hz… they're all built around motion-interpolating 24/25/30Hz source material, and claiming preposterous refresh rates as just another marketing gimmick.
    My 6-year-old plasma TV is apparently 600Hz!

    • Airmantharp
    • 8 years ago

    TVs don't have true >60Hz input signals.

    • cobalt
    • 8 years ago

    To be clear, I was mostly responding directly to the “double the video card budget”, and making a purely mathematical argument, assuming a monitor would last at least twice as long as said video card.

    But I totally agree these are all a bit too pricey. Any $700 monitor had better last me a LONG time, because the GTX 460 I bought 3 years ago for less than $150 has lasted remarkably well. (Think it’s time for a new card, though. Finally…..)

    • JosiahBradley
    • 8 years ago

    Why is it so easy for TVs to have IPS or VA panels and true 120/240Hz input signals, but so hard for a monitor to have them? Isn't a TV just a really big monitor with an I/O board geared toward media rather than a computer (HDMI inputs versus DP)?

    • SomeOtherGeek
    • 8 years ago

    I know! It is gross!

    • mesyn191
    • 8 years ago

    I agree with your reasoning. The problem is most just don't have that sort of money to spend, period.

    • cynan
    • 8 years ago

    What? 16:9?! Who on earth would want this piece of crap?

    • cynan
    • 8 years ago

    He obviously has a 4K display. Or maybe he's been to the future and has one of these: https://techreport.com/news/27021/dell-5k-desktop-display-is-the-new-king-of-megapixels

    • Pwnstar
    • 8 years ago

    No, you still need low frame times to reduce input lag. 40 FPS is 25ms per frame, whereas 120 FPS is 8.3ms, much lower!
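    The arithmetic behind those figures, frame time being 1000/fps milliseconds:

    ```python
    # Frame time in milliseconds is 1000 / fps:
    for fps in (40, 60, 120, 144):
        print(f"{fps:3d} FPS -> {1000 / fps:5.1f} ms per frame")
    # 40 FPS -> 25.0 ms, 120 FPS -> 8.3 ms
    ```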

    • Pwnstar
    • 8 years ago

    THE GABEN IS HUNGRY!

    • cobalt
    • 8 years ago

    The strobing does two things, though, doesn’t it? Apart from merely hiding the pixel transitions, it reduces the persistence of the display overall, reducing the motion blur you see from eye motion (i.e. the “sample and hold” problem). To do that, the “On” time needs to be very short — or phrased alternatively, the Off time is going to be much longer than the On time.
    (http://www.tftcentral.co.uk/articles/content/motion_blur.htm#oscilloscope looks like something around 4x longer Off than On.)

    I would think that would limit what you could do with that type of solution while still retaining the benefits. It's interesting to think of it in the backwards direction, but I'm not sure you can adjust the "on" time of the successive frame to compensate for the prior frame's length. I think you need to know how long your "off" time is FOLLOWING the "on" time, so you can give your rod and cone neurons a big enough jolt to compensate. I suspect trying to do it afterwards would cause flickering, or other perceptual issues, but that's merely speculation.

    The best solution I can think of would be to strobe at a higher frequency and have your vsync updates be quantized in multiples of that higher frequency. E.g. at 240Hz, you could have any given frame be 30, 34, 40, 48, 60, 80, or 120Hz. It wouldn't be continuous, and it's a big jump from 60 to 80 to 120, but that's still a lot of values between 30 and 60 where you'd need it most.
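    The quantized rates in that last idea are just the integer divisors of the 240Hz strobe clock; a quick check:

    ```python
    # Effective refresh rates when each frame must last a whole number of
    # 240 Hz strobe periods (the "34" in the comment is 240/7, ~34.3).
    BASE_HZ = 240
    rates = [BASE_HZ / n for n in range(2, 9)]
    print([round(r, 1) for r in rates])
    # -> [120.0, 80.0, 60.0, 48.0, 40.0, 34.3, 30.0]
    ```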

    • Pwnstar
    • 8 years ago

    $500 is my max.

    • Chrispy_
    • 8 years ago

    Like the G-Sync monitors, it hasn't been done yet (they either run in strobe mode or G-Sync mode, rather than both), but the problem is definitely solvable (if not already solved):

    By treating it as an always-on backlight that only turns off when the new frame arrives and the pixels change. The longer it has been since the last frame, the shorter the *off* period needs to be to compensate for what your retina perceives as *brightness/time*. No complex prediction, really simple algorithm.

    My guess is that manufacturers won't want to make separate G-Sync and FreeSync models, so they'll implement a strobe mode for the Nvidia customers that FreeSync customers can also use.

    Seriously though, I'm getting excited by the prospect of a strobing, IPS-like, 1440p monitor with adaptive *FreeSync*. Finally it'll be an improvement over CRTs in every way, not just *some* ways!
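    One way to model the compensation described here, as a hedged sketch: treat perceived brightness as the lit fraction of each frame period, and size the dark pulse fired at each new frame so that fraction stays constant. (Under this simple model the dark pulse scales with the frame period; real tuning would be perceptual, and the comment's exact rule may differ.)

    ```python
    # Always-on backlight with a brief off-pulse fired when each new frame
    # lands. Model assumption: perceived brightness ~ lit fraction of the
    # frame period, held constant by scaling the dark pulse. Illustrative
    # numbers only; not any monitor's actual firmware logic.

    LIT_FRACTION = 0.9  # assumed target: 90% of each period spent lit

    def dark_pulse_ms(frame_period_ms):
        """Off-pulse length that keeps the lit fraction constant."""
        return (1 - LIT_FRACTION) * frame_period_ms

    for fps in (144, 60, 40):
        period_ms = 1000 / fps
        print(f"{fps:3d} fps ({period_ms:5.1f} ms frame) -> "
              f"{dark_pulse_ms(period_ms):4.2f} ms dark pulse")
    ```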

    • Tristan
    • 8 years ago

    144 and 120Hz aren't necessary already. Thanks to FreeSync, G-Sync, or the new DP 1.3, 40-60 FPS will be enough.

    • cobalt
    • 8 years ago

    From my interpretation of this post, strobing is not part of the spec itself:
    https://techreport.com/forums/viewtopic.php?f=3&t=94725#p1215649

    That's not to say a monitor wouldn't support it, of course, but it sounds like it's not guaranteed. And it's likely to have the same problems as G-Sync with ULMB strobing anyway: as of today, you have to choose between strobing and adaptive sync; you can't have both enabled.

    (I'm no expert, but my understanding is that if you strobe the backlight, you need to know how much brighter to make it to compensate for the shorter duration, and that's based on how long until the next strobe, which you don't know because you won't know how long until the next frame is ready. I'm hoping this is a solvable problem, though.)

    • Firestarter
    • 8 years ago

    apparently, technically different technologies are the best kind of different technologies

    • Firestarter
    • 8 years ago

    I’m very curious as to how much ghosting TFTCentral can detect when they review this monitor. If this monitor manages to deliver on the expectations that the specs set, then I might just have to empty my wallet!

    • Firestarter
    • 8 years ago

    why would you want a 295 X2 if all you have is a 60Hz display?

    • culotso
    • 8 years ago

    This is the holy grail.

    • Airmantharp
    • 8 years ago

    Panels don’t have G-Sync/FreeSync/Adaptive V-Sync or anything of the sort; monitors do.

    • superjawes
    • 8 years ago

    I’ve yet to see anyone say that this display would have G-Sync…what I do see is people hoping that this eliminates the TN requirement for G-Sync displays, as all G-Sync displays have maximum refresh rates of 144 Hz, which is not currently available on IPS (or IPS-like) models.

    • 0x800300AF
    • 8 years ago

    Where exactly does AUO mention this panel will have G-Sync? It is not listed anywhere other than a wishful TR news post (see above) and several forum posts by equally wishful fans. If you are going to post your own hopes into news posts, you might as well read: "Happily, we might soon get 144Hz G-Sync/Adaptive Sync, DP 1.3, HDMI 2.0 goodness and IPS image quality in the same monitor."

    • Chrispy_
    • 8 years ago

    I’m hoping strobing and low latency are part of the freesync spec 😉

    • DarkUltra
    • 8 years ago

    You forgot backlight strobing and zero input lag.

    • odizzido
    • 8 years ago

    Take out gsync and lower the price.

    • Chrispy_
    • 8 years ago

    Freesync, 144Hz, 1440p, non-TN?

    TAKE MY MONEY!!!

    • superjawes
    • 8 years ago

    From the first link:
    "AHVA is their equivalent to LG.Display's IPS (In Plane Switching) technology, as is Samsung's PLS (Plane to Line Switching). Both were designed as a competing alternative. All 3 technologies (AHVA, IPS, PLS) are very similar in characteristics and performance in practice, and are often simply labelled as 'IPS' by manufacturers. This is why we refer to them as IPS-type in this news piece."

    How's that? Basically, it should offer the same IPS effects (good colors and viewing angles), but it is technically a different technology.

    • MadManOriginal
    • 8 years ago

    Yep. I also want to know what ‘IPS-like’ means…it’s more than viewing angles.

    • DPete27
    • 8 years ago

    Do you even know what screen tearing is?

    • Prestige Worldwide
    • 8 years ago

    Hallelujah, the good lord GabeN has smiled upon us today.

    • bfar
    • 8 years ago

    If we could get this kind of tech along with G-Sync/FreeSync and 4K at mainstream price levels, that's what the PC gamer crowd is after. Unfortunately, we could still be looking at a few years before this is a reality.

    • crystall
    • 8 years ago

    Glad to hear that the T420 panel is not indicative of their general quality, but I was still burned pretty badly, and so were quite a few other people (replacing the T420 panel is quite a popular topic if you Google around for that model).

    • ozzuneoj
    • 8 years ago

    While I don't have any experience with the T420 screen you mentioned, I replaced the broken 768p screen in my Asus Q500A with an AUO B156HW01 from a ThinkPad, and it has been fantastic. Far and away the best laptop screen I've seen. I bought it used, and it's been going for a little over a year now with no problems at all.

    It may be the exception to the rest of their lineup, though. This particular panel (at least in its earlier revisions) is known for its exceptional viewing angles and color reproduction.

    • puppetworx
    • 8 years ago

    IT’S HAPPENING!!

    • GhostBOT
    • 8 years ago

    At $999 I'd rather have a 295 X2. Screen tearing isn't much of an issue when the FPS is that high.

    • derFunkenstein
    • 8 years ago

    It’s also “only” about 75% more than 60Hz 1440p monitors. At the outset, you can just about guarantee that $700 is the minimum for one of these, and a reasonable price…

    …that there’s no way in hell I can afford. I feel your pain, brah.

    • cobalt
    • 8 years ago

    But a good monitor will last much longer than a good video card, so it’s quite reasonable to spend more on one. At least I think so; I’d be willing to spend $700 on a new monitor, but not on a video card.

    Also, note that the ROG Swift has the ROG branding and some extra features that they probably feel they can charge more for, so it’s possible that without things like monitor-provided crosshairs that most people don’t need, this might not be (much) more.

    (Plus, the TN panel may still be faster, with less motion blur, than this AHVA variant, so it's not necessarily a win in every category despite similar specs. Plus, there's that IPS glow thing that their previous AHVA panels still had. Those facts might keep the cost more reasonable, depending on who finally makes the monitor.)

    Then again, maybe I’m just being optimistic…..

    (Quick edit: I probably shouldn’t have used the “VA” term; the “VA” in AHVA doesn’t stand for Vertical Alignment, so it’s not really a VA variant as most people think of them.)

    • superjawes
    • 8 years ago

    Which is why I said that it’s a waiting game for the tech to trickle down. The problem with the price is that the market sets it, and since there’s an $800 monitor out there with similar specs (2560×1440 resolution, 27″) I can’t see this monitor selling for less than $600. I hope that it will be less than the ROG G-Sync one since said ROG has the shiny new technology.

    At $800, I’d rather have G-Sync, but if they can sell this one for $700, I’d be happy without it, provided that I am getting the same resolution, size, and refresh rate in addition to the IPS-quality display.

    • nanoflower
    • 8 years ago

    Even if it turns out that this new monitor isn't that good, the mere fact that someone has introduced a 144Hz IPS monitor (or close enough for horseshoes) will help move the market along. That's a good thing for us all.

    • mesyn191
    • 8 years ago

    $700 is completely doable?! That is double most people’s video card budget alone! $300-ish is the max most will spend on a monitor.

    I’d also expect this thing to be well over $800 at least at launch if $800 gets you a better than average TN panel + G-Sync.

    • sweatshopking
    • 8 years ago

    I don't mind 60Hz, and my IPS panel (a 23-inch HP I got for like $100) seems OK for most games. I don't play many shooters, mostly MOBAs and RTSes, but it seems like it works well enough.

    • superjawes
    • 8 years ago

    It’s been around for a while…but it’s also a multiple of 24 and 48, which translates to the FPS numbers for film.

    I think YouTube and similar content is 30/60 FPS, which lines up with the 60Hz standard set by CRT refreshes and NA AC power frequencies.
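    The divisibility argument, spelled out: on a 144Hz panel, 24 and 48 FPS content maps to a whole number of refreshes per frame, while 30/60 FPS content does not; the reverse holds on a 60Hz panel:

    ```python
    # Refreshes per content frame on a 144 Hz display: 24 and 48 fps film
    # rates divide evenly, while 30/60 fps video does not (the reverse of
    # the situation on a 60 Hz display).
    for content_fps in (24, 30, 48, 60):
        print(f"{content_fps:2d} FPS content on 144 Hz: "
              f"{144 / content_fps:.1f} refreshes per frame")
    ```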

    • mesyn191
    • 8 years ago

    Probably the max the panel can do.

    • GodsMadClown
    • 8 years ago

    Indeed, this sounds like a nice display, but what's up with the 144Hz rate? It's just a strange number. Where'd they pull that from?

    • crystall
    • 8 years ago

    I wouldn’t touch an AU Optronics panel with a ten foot pole, no matter what they say about its quality. I’ve got a ThinkPad T420 which was cursed with one of their matte panels and the quality was abysmal (no, not laptop-bad, really abysmal). I ended up replacing it with an equivalent LG display as soon as I could.

    • superjawes
    • 8 years ago

    This sounds AWESOME. My only concern is the price. I’d say that $600 would be ideal (since $800 gets you a G-Sync, better-than-other-TNs display), but $700 is probably still completely doable.

    Then it’s just a waiting game for this tech to trickle down. I have to imagine that hardcore gamers would prefer IPS displays if they could get the same refresh rates.

    • drfish
    • 8 years ago

    Sold!
