G-Sync HDR monitors might go on sale in two weeks

Not every gerbil is a gamer, but there are enough of you out there that I don't feel weird asking: how's your gaming display? It could always be better, unless you've somehow snagged a pre-release G-Sync HDR display. We first heard about the tech all the way back in January of last year, and we're still waiting for the monitors to hit the market. The wait might soon be over, though. PC World just put up a video on its YouTube channel in which the mag's executive editor Gordon Mah Ung gushes over the Acer Predator X27. During the demo, he remarks that the display should be available in about two weeks.

If you're confused as to why we're so excited about these monitors, take a moment to examine the specifications of the Predator X27. It's a 27″ display with a 3840×2160 IPS panel. The X27 combines a super-bright LED backlight with 384 local-dimming zones and a quantum-dot filter to produce a maximum brightness of 1000 cd/m² and a stated contrast ratio of 50,000:1. Gordon Mah Ung seems to imply that this is a static contrast ratio (as unbelievable as that is), but he stops short of actually saying so. We'll have to see once the monitor ships.

All those specs are real fancy, but the truly interesting part of this display is that it can hit all of the above image-quality marks while running at a 120-Hz refresh rate. Acer advertises the Predator X27 as capable of 144 Hz, but Gordon Mah Ung clarifies that that's an overclocked rate, as the DisplayPort 1.4 and HDMI 2.0 connections the monitor offers only support 3840×2160 at up to 120 Hz. We heard rumblings in the past that this display would manage 144 Hz through chroma subsampling, but that may no longer be the case.
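For the curious, here's a rough back-of-envelope check (our own numbers, not from PC World's video) of why 144 Hz would need either an overclock or chroma subsampling. DisplayPort 1.4's HBR3 link carries 32.4 Gbps raw, or roughly 25.9 Gbps of payload after 8b/10b encoding, and comparing that against the active-pixel bandwidth each mode needs (ignoring blanking intervals) tells the story:

```python
def required_gbps(width, height, hz, bits_per_pixel):
    # Active-pixel payload only; a real link also carries blanking intervals.
    return width * height * hz * bits_per_pixel / 1e9

# DP 1.4 HBR3: 32.4 Gbps raw, ~25.92 Gbps effective after 8b/10b encoding
DP14_PAYLOAD_GBPS = 32.4 * 0.8

modes = [
    ("120 Hz, 8-bit RGB",          120, 24),
    ("144 Hz, 8-bit RGB",          144, 24),
    ("144 Hz, 8-bit 4:2:2 chroma", 144, 16),  # subsampling averages 16 bpp
]
for label, hz, bpp in modes:
    need = required_gbps(3840, 2160, hz, bpp)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "exceeds"
    print(f"{label}: {need:.1f} Gbps ({verdict} DP 1.4's ~25.9 Gbps payload)")
```

Full 8-bit RGB squeaks through at 120 Hz but not at 144 Hz, which lines up both with the 120-Hz connection limit Ung describes and with the earlier chroma-subsampling rumblings.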

There's no mention on the product page of the Tobii eye-tracking hardware that this display was originally slated to include. It will be interesting to see if Acer simply neglected to mention it, or if it's been nixed from the design entirely. Certainly Tobii's hardware hasn't caught on the way the company may have hoped. It would still be a cool feature to have, though.

Some of us have been salivating over that gorgeous G-Sync HDR goodness since we first heard the announcement during CES last year, or precisely 488 days ago. Asus and Acer had apparently planned to launch these high-end gaming monitors last year, but they slipped into 2018 for whatever reason. An April launch date came and went with little fanfare. At long last, it seems these monitors might finally make it to market.

There's no mention—in PC World's video, or otherwise—of what we might expect to pay for this display. Gordon Mah Ung cheekily notes that “it won't be free,” but that might be a gross understatement if the latest rumors we heard are true. As you may recall, the Predator X27 and the similar Asus ROG Swift PG27UQ both showed up for preorder at European retailers last month. If those prices are accurate, expect to shell out around $2500 for one of these beauties.

Comments closed
    • snarfbot
    • 2 years ago

    i guarantee this thing will have abysmal contrast, typical to an ips monitor, ~1000.

    85hz at absolute best @ 4k probably even less. 120hz @ WQHD.

    local dimming will be implemented very poorly making it completely useless.

    you know how i know? because its a 27″ ips with 384 zone local dimming. there are 2 other monitors announced that are basically the same thing. the Dell UP2718Q, and the Asus PG27UQ.

    the Asus was quietly cancelled or delayed indefinitely afaik. the Dell was released though, and its horrendously bad.

    how many other 27″ 4k ips panel manufacturers are there with integrated local dimming. zero. its the same panel made by the same company, and its going to have roughly equivalent performance.

    dont get your hopes up.

    • lifestop
    • 2 years ago

    Pros:

    1. The best specs I’ve seen in a gaming monitor. If you aren’t purely focused on competitive-gaming, this is truly an awesome chunk of technology.

    Cons:

    1. Tobi Eye Tracker – I hope it’s gone from the final product, because not everyone needs it, and it just adds to the already ludicrous (speculation) price. If you really want it, you can purchase it separately.

    2. 4k – Whoa, who can run 4k at high refresh? I love that this piece of hardware seems to be somewhat future-proof, but I love me some high refresh rates! I could always drop down to the 1080p, but NO ONE that’s buying this premium beast is going to want to play at that low of a resolution. 1440p is the current sweet spot for speed AND resolution and this monitor is clearly ahead of its time. Sure, if you don’t care about framerate (or are willing to drop down to 1080p), 4k might be fine. But I want the best of both worlds! Gimme some high refresh AND better resolution. 1440p + 144hz-240hz.

    3. 120hz??? For THAT price!?! I upgraded my entire rig because Overwatch would occasionally drop from 144hz to 120hz. Yes, I could feel it, and it sucked. I’m much happier now. Lesson learned? I will not go lower than 144hz. Hell, I’m dying for the day I can move up to 240hz, but first I’m going to increase resolution because I feel like the FPS jump from 144 to 240 is going to be less impressive than the 1440p gains. I haven’t had the chance to test 240hz, but I’ve heard the stories of diminishing returns.. and frankly, I’m scared that I won’t be able to play at 144hz without it feeling like a slide-show if I try 240. 60 fps is ruined for me forever. =(

    4. Price. Yep, it’s a cool piece of tech, but it might as well not exist for most of us. Add the high price of the monitor with the extreme cost of a couple gtx 1080 ti cards to push 4k at reasonable settings and framerate, and you will need to be fairly wealthy, or simply not care about retirement.

    Conclusion: Meh. I’ll care about this kind of monitor when the price is in the $400-700 range and graphics cards that can handle 4k @144hz exist.

    Until then, I’m going to be happy with a less impressive 144hz (or higher) 1440p monitor – hopefully with some decent HDR, response time, and low input lag.

    Maybe the new AOC panels will live up to the hype? If not, I’ll probably just put off thinking about HDR for a couple more years – it’s not like there are very many HDR pc games anyway.

    It’s kind of depressing that I can buy an amazing HDR, 4K, 240hz, 55″, Nearly bezel-free television with respectable built-in speakers (who uses those?) for far less than this thing is estimated to cost. God damn, I could get a premium OLED for that price! Also, where the hell are our OLED pc monitors?

    • psuedonymous
    • 2 years ago

    There appear to be a lot of “that’s more expensive than a UHD TV, WRYYYY!!1!eleventyone!” posts.

    Beyond needing new panels, the issue is the backlight. Nobody has a high refresh rate FALD backlight, at any size. You need to drive some pretty high power LEDs, at high fidelity, at 5x or more the refresh rate of those in a UHD TV (only required to keep up with 24p content). That’s fast enough that rise and fall times of the LEDs themselves start to have some weird nonlinear behaviour, and that needs to be modelled precisely in order for the backlight to work (only open-loop control possible, there is no feedback without pointing a calibrated camera at the panel). And it all needs to be crammed into a physically smaller chassis, and work with a viewer sat much closer to the panel (e.g. you need to deal with the sharp transition of illumination axis between the edges of two FALD zones), and do so with a much lower latency.

    Unlike with a UHD TV, this is a technical capability *beyond* that of the obscenely expensive studio reference monitors that go for 5+ figures.

      • jts888
      • 2 years ago

      What are you talking about? (U)HD TVs have already been doing 120 Hz strobed backlights for the majority of a decade to enable (along with frame interpolation) so-called Soap Opera Mode.

      LEDs can and do have incredibly quick reaction times and are actually more non-linear in their voltage-color response, which is why TV makers generally try to get away with PWM-controlled (i.e., flickering/strobey) backlighting instead of voltage-modulated when they can.

      Almost everything in the display chain has been ready for UHD@120Hz for years, except the critical external connection interface.

        • psuedonymous
        • 2 years ago

        A strobed backlight is *VERY* different to a FALD array. A single-plane strobed backlight has to drive one LED bar (edge-lit panel) on and off, which in many cases will be a single LED type (e.g. white phosphor, quantum dot). This means a single global bang-bang response. The panel itself can be driven identically to a constant-backlight panel.

        In contrast, FALD requires several hundred drivers (if using QLED or phosphor; over a thousand for individual RGB driving). The LEDs are never just turned on and off for a defined pulse; they need to hit a certain exact brightness level. You also need a huge amount of extra processing, as the backlight and panel can no longer be considered separately: to get a desired pixel output luminance requires a combination of the subpixel transmissivity AND the emission from the backlight (which, due to inter-zone blending, means looking at multiple LEDs). All of these change at different rates, nonlinearly.

        And then you have VRR on top of that, and you get even more headaches. Remember the early Freesync panels, which lacked dynamic pixel-response tuning for off-normal refresh rates and suffered edge ghosting/blurring (https://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Gaming-Experience-FreeSync-)? You now have *two* nonlinear elements that both need to hit the desired value at the desired time in order for any given pixel to output the correct value.
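To illustrate the point about the backlight and panel no longer being separable, here's a deliberately toy one-dimensional model (three zones, three pixel regions, invented blending weights): each region sees a blend of neighbouring FALD zones, and the controller must solve for panel transmissivity against that blended backlight.

```python
import numpy as np

# Toy model: pixel luminance = (effective local backlight) x (LCD transmissivity).
zone_drive = np.array([0.2, 1.0, 0.4])      # per-zone backlight levels (0..1)
blend = np.array([[0.8, 0.2, 0.0],          # rows: pixel regions,
                  [0.1, 0.8, 0.1],          # columns: how much each zone
                  [0.0, 0.2, 0.8]])         # leaks into that region
backlight = blend @ zone_drive               # effective backlight per region
target = np.array([0.10, 0.90, 0.05])        # desired output luminance
transmissivity = np.clip(target / backlight, 0.0, 1.0)
achieved = backlight * transmissivity
```

Even in this static toy, the middle region can't quite reach its 0.9 target (the blended backlight there only comes to 0.86), so zone drive and transmissivity have to be solved jointly; now imagine re-solving that every refresh, while both LED and pixel responses lag nonlinearly, at a variable refresh rate.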

          • snarfbot
          • 2 years ago

          yea so? look up what tv manufacturers call black frame insertion. samsung 2018 tv’s support freesync too. the first of many, probably.

          this is a solved problem, they aren’t doing anything revolutionary here.

          • Kretschmer
          • 2 years ago

          Thank you for the thorough elaboration!

            • snarfbot
            • 2 years ago

            its going to be the same as the Dell UP2718Q, or Asus PG27UQ, with a different badge on it.

            if it is even ever released, they were all announced around the same time over a year ago and only the dell was shipped.

            theres alot of bs marketing talk guys here talking about local dimming like its some new discovery that acer in this case and nvidia just cooked up to save pc gaming. its not, its very old, although you wouldnt know that based on the way it was implemented for this monitor. for a solved problem they did a terrible job.

            its possible they got a new firmware that can control the leds better than the one that dell shipped. but i doubt it.

            • Kretschmer
            • 2 years ago

            That Dell is a completely different panel and product.

            • snarfbot
            • 2 years ago

            no it isn’t

            • Kretschmer
            • 2 years ago

            The Dell UP2718Q is 60hz. The monitors discussed in this article are 120+Hz.

            • snarfbot
            • 2 years ago

            yea, well see about that.

            • psuedonymous
            • 2 years ago

            The PG27UQ uses an AUO M270QAN02.2 panel. The Dell uses an LG LM270WR6-SPA1 panel.

      • techguy
      • 2 years ago

      Your final point about needing to cram all that tech into such a small space is precisely the cause of the complaint, from this point of view. 27″ is entirely too small of a display for $2500. The featureset is great, the price and size are not.

        • psuedonymous
        • 2 years ago

        Personally, if the displays were around the 32″-ish mark I'd buy one in a heartbeat, even at the rumoured £1000+ price point. I wouldn't go smaller than the current 32″ ultrawide (the 27″ would effectively be cutting the sides off), and the BFGDs are just *too* big to use on a desk (and I can't feasibly re-arrange the room to suit a TV sat a metre or two behind the desk), so the still-MIA HDR ultrawides would be the best option if they ever turn up. Keeping the same resolution would feel like a sidegrade though; even if everything other than resolution is a vast improvement, it's resolution that has the biggest non-gaming day-to-day impact.

    • Tristan
    • 2 years ago

    maybe from acer, asus will launch it in june

    • DPete27
    • 2 years ago

    1000nits would blind me. Nobody is going to use that monitor with the backlight set that bright unless the sun is shining through your window on the monitor.

      • cpucrust
      • 2 years ago

      A terrible itch would ensue as well

      • Cuhulin
      • 2 years ago

      The key to brightness measurements with most HDR displays is that the wider range of colors also means that only a smaller portion of the screen displays maximum brightness at any given time.

      Explosions, headlights coming straight at you, and the like are brighter, but the overall display may or may not be – depending on what is on the screen.

      • Kretschmer
      • 2 years ago

      The extra brightness is useful with ULMB strobing mode, which cuts luminescence by ~60%.

    • Freon
    • 2 years ago

    I don’t see why this should cost more than a grand or so.

    • Kretschmer
    • 2 years ago

    Why is anyone pushing 4K for gaming? Current cards (at inflated prices, too) can’t push that many pixels at the 120Hz these monitors are being sold for, and the mainstream 2560×1440 or 1920×1080 resolutions are being completely ignored.

    Give me a 2560×1440 or 3440×1440 display with consistently-fast response times and enough brightness to comfortably run strobing, please. Add in GSync working with said strobing to entice the big bucks from my wallet. Once you have this baseline mastered, then work in HDR.

    Until this happens, we’re all stuck using 2015 monitor tech. My X34 from 2015 barely has any upgrades available on the current market. Everything is either dog-slow MVA panels or barely-incremented tech that does little to advance color or motion clarity.

    • brucethemoose
    • 2 years ago

    People say RAM prices are absurd. I say PC monitor prices are.

    I’m sitting on a $400 27″, 1440p, 110hz IPS monitor from 2011. 7 years later, I can *barely* get something equivalent for the price, and a few extra features quintuples the cost. Meanwhile, TVs have been steadily getting cheaper over the years.

      • cygnus1
      • 2 years ago

      Been trending this way for a while with monitors. I think they’ve been slowly working to raise average selling prices as new features get added, very much resisting the race to the bottom that happens with features on TVs.

        • brucethemoose
        • 2 years ago

        Yeah, the TV market is just so competitive.

        Is there a batsignal for those Korean manufacturers? Or maybe Monoprice or Nixeus? 120hz input, VRR and a decent backlight don’t quadruple the BoM, so it seems like there’s room for some hero manufacturer to come in and undercut the “gaming” monitor market.

      • stefem
      • 2 years ago

      Well, the current offerings probably perform quite a bit better than your 7-year-old panel

      • lifestop
      • 2 years ago

      Yep. I’m almost tempted to just buy another television and set it up 4 feet behind my desk.. input lag would probably be an issue, but I’m going to test it out with my existing television for fun.

    • JosiahBradley
    • 2 years ago

    G-sync needs to die already. And yes I own a 1080 ti and can fully push these things but I ain’t paying the tax.

      • Chrispy_
      • 2 years ago

      I concur.

      Not only do G-Sync monitors fragment the market unnecessarily, they also mean that Nvidia GPU users get screwed when using normal monitors, since most >60Hz displays do not play well with Nvidia GPUs and frameskip at anything other than 60 and 120Hz for a world of juddery microstuttering.

      Even if you can’t afford a G-Sync monitor, it would be amazeballs to get 100fps at 100Hz on a standard monitor with an Nvidia card. I’ve yet to see an NV card + fixed refresh display operate correctly in the last 5+ years and I’m pretty sure it’s because they’re trying to force people into buying G-Sync.

      It doesn’t even need a high-refresh display, I’ve failed to get >60Hz on my Nvidia cards (960, 970, 980, 1060) using *any* overclocked monitors. Many 60Hz monitors can reach 70 or even 75Hz and that is a decent upgrade if you have an AMD card. Sadly, every Nvidia user probably *thinks* they're getting 75Hz out of their overclocked monitor but it's not smooth because they're just skipping frames (https://www.testufo.com/frameskipping) :'(
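If you'd rather not eyeball the testufo animation, frame skipping is easy to detect from frame-presentation timestamps. This is a generic sketch of the idea, not any particular tool's method, and `count_skipped_frames` is a hypothetical helper: it flags any gap between presented frames that is much longer than one refresh period.

```python
def count_skipped_frames(timestamps_ms, refresh_hz, tolerance=1.5):
    """Count intervals where more than one refresh period elapsed
    between consecutive presented frames."""
    period_ms = 1000.0 / refresh_hz
    skips = 0
    for prev, cur in zip(timestamps_ms, timestamps_ms[1:]):
        # a gap well beyond one period means the display repeated a frame
        if (cur - prev) > tolerance * period_ms:
            skips += 1
    return skips

# A 75 Hz monitor should present every ~13.3 ms; a ~26.7 ms gap is a skip.
stamps = [0.0, 13.3, 26.7, 53.3, 66.7]
print(count_skipped_frames(stamps, 75))
```

A monitor that nominally accepts 75 Hz but silently drops to an effective 60 Hz cadence would show regular skips by this measure.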

        • synthtel2
        • 2 years ago

        144 works, and 60/120/144 covers most monitors. It’s definitely still obnoxious, but a lot less so than just 60/120 would be.

          • Chrispy_
          • 2 years ago

          I’m surprised it works at 144Hz. Not that I doubt you, it just means it’s even stupider that Nvidia can’t do 72Hz.

          I returned my Predator Z1 so the highest refresh screen I have to test with now is 120Hz.

        • stefem
        • 2 years ago

        WTF are you talking about? NVIDIA GPUs aren’t skipping frames, they are inserting special frames with subliminal messages to brainwash and control your mind! NVIDIA was founded by Hitler’s son, who is now pursuing his father’s project!

        Even excluding your tinfoil theories, you talk like you’ve tested every monitor available on earth, c’mon… I never had an issue with 60Hz monitors and NVIDIA GPUs

        • Kretschmer
        • 2 years ago

        I’ve run my MG279Q at different refresh rates (100Hz/120Hz/144Hz) with my 1080Ti and didn’t see frame skipping on the Blurbusters tests. Are you talking about your own overclocking failing (e.g. trying to run a 60Hz monitor at 70Hz) or are you saying that any >60Hz refresh rate doesn’t work with Nvidia GPUs?

          • K-L-Waster
          • 2 years ago

          Gotta concur with this — it matches what I’m seeing with my 1080TI as well.

      • AMD718
      • 2 years ago

      Same here. 1080 Ti but not buying G-Sync.

      • Kretschmer
      • 2 years ago

      Is my 1080Ti and Techreport’s 1080Ti defective? I wouldn’t dream of trying to push 4K@120Hz, yet many people are popping up here expecting to run games at 4K and 100+Hz.

      https://techreport.com/review/31562/nvidia-geforce-gtx-1080-ti-graphics-card-reviewed/13

    • Bauxite
    • 2 years ago

    Over a year late and $500 above the already way overpriced original “estimates”.

    Seems like a token DOA launch just for the sake of trying to get ahead of TVs getting better inputs and features soon.

    • 223 Fan
    • 2 years ago

    To be fair the monitor is called the Predator. Now we know what (who) is the prey.

      • tay
      • 2 years ago

      *golfclap*

      • cpucrust
      • 2 years ago

      “The way you are meant to be preyed”

    • sweatshopking
    • 2 years ago

    299 Cad and we have a deal. Any more and wife wont give permission.
    I sure wish Nvidia would support freesync on my 1060…

      • Voldenuit
      • 2 years ago

      This is what you get for enabling 2FA on the wife.

      • Kretschmer
      • 2 years ago

      Just 10 easy payments of $299 CAD! Cough a bit when you say “a month” after the $299, and she’ll never be the wiser.

      • Firestarter
      • 2 years ago

      freesync? That sounds awfully communist!

        • K-L-Waster
        • 2 years ago

        Reasonably affordable sync?

    • techguy
    • 2 years ago

    High end PC monitors are some of the most absurdly-priced components in the PC ecosystem. $2500 for a 27″ monitor! I spent less than that on a 70″ 4k HDR TV with local dimming. Not a chance Asus.

      • Chrispy_
      • 2 years ago

      Yeah, I said in another post that this is well into OLED price territory.

      At $2500 I’m looking at a 55″, 120Hz OLED 4K HDR TV with rolling-scan technology to reduce image persistence. Sadly that’s interpolated from a 60Hz source, but native 120Hz OLED panels were shown at CES 2018, and I’m expecting them to go on sale soon.

        • PixelArmy
        • 2 years ago

        Please use that 55″ screen from 2 feet away, on your desk. Apples and oranges…

        (Not saying this is a good deal)

          • Chrispy_
          • 2 years ago

          Point being that 55″ screens are way more expensive than 27″ screens, so this is a biased example hugely *in G-Sync's favour*, yet it still costs WAAAAAAAAAAAY too much.

          At the suggested $2500, 1080Ti owners can afford to throw their 1080Ti in the trash, buy two Vega64's to run in crossfire, add on a Freesync 4K HDR monitor and still have change left to celebrate. Sure, the 1080Ti is better than Vega64, but that doesn't mean you should lock yourself into a *5x markup on monitor prices* just for the sake of G-Sync, which is a stupid way to do VRR in the first place and only helps Nvidia perpetuate this ridiculous money-maker.

            • PixelArmy
            • 2 years ago

            I got your point, I’m saying it is not relevant in this context.

            And please link to the monitor that checks all these:
            * Freesync
            * 4K
            * HDR
            * IPS
            * 120+ Hz

            According to https://www.blurbusters.com/freesync/list-of-freesync-monitors/ it does not exist.

            • Kretschmer
            • 2 years ago

            Can you link to a FreeSync monitor with similar specs? You’re not paying this markup for GSync, you’re paying this markup for one of the first desktop monitors on earth with 4K/144Hz/HDR.

            • Chrispy_
            • 2 years ago

            Well no, because it’s not possible, and if you read the article it doesn’t look like this monitor can do it either; it’s a bandwidth limitation of the current DisplayPort/HDMI standards, so 120Hz is the limit.

            4K/Freesync/HDR and WQHD/Freesync/HDR are commonplace, though; Asus stops at 100Hz right now and most of the LG/Samsung offerings are 75Hz at most.

            • Kretschmer
            • 2 years ago

            What is the Asus 100Hz 4K HDR model?

            • Chrispy_
            • 2 years ago

            Where, in ANY of my replies do I mention 100Hz 4K HDR? You need to improve at reading comprehension.

            As for 100Hz WQHD HDR, there are both Freesync and G-Sync options from Asus’ PG34 and PG35 ranges, and that’s just a small sample of what’s out there, including the Benq EX3501R, the insane 144Hz HDR Samsung 49CHG90, and the 100Hz HDR Samsung C34F791. If anyone’s lagging behind at the moment, it’s Acer.

            • Kretschmer
            • 2 years ago

            You need to improve your conversational relevance. 🙂

            Initially you complain about this monitor being at a 400% premium because GSync bad grrr.

            “At the suggested $2500, 1080Ti owners can afford to throw their 1080Ti in the trash, buy two Vega64’s to run in crossfire, add on a Freesync 4K HDR monitor and still have change left to celebrate.”

            I noted that this monitor is literally the first in the world to offer 4K/144Hz(OCed)/HDR, and that buyers are paying a premium for the specs, not GSync. You then insisted that 75Hz and 100Hz options are available at much cheaper prices. It turns out that you were comparing Apples to ($2,500) Oranges, as none of those models are 4K/144Hz/HDR.

            So, I’m still waiting to understand why you’re so incensed at the GSync module for this monitor being so expensive. If you’re just frustrated in general that there are too few GSync offerings at too high a price premium, I’m right next to you. My X34 has aged *depressingly* well in the GSync marketplace.

            • PixelArmy
            • 2 years ago

            You’ve run into the master of the straw man. Rather than discuss the product at hand, VRR/4k/120+Hz/HDR/IPS(like), he’s redirected the conversation to VRR/< 4k/< 120 Hz/HDR/IPS(like).

            And the trio of monitors that comprises the “_small_ sample” is nearly the _entire_ sample… All hyperbole. In fact, an Asus with FreeSync/WQHD/100Hz/HDR, AFAIK, doesn’t actually exist. (Links could prove me wrong, but as you can see they’ve been resisted thus far, most likely for the obvious reason.)

            • Chrispy_
            • 2 years ago

            You guys seem to have an agenda. All I started with was saying that this stuff is in the price territory of OLED TVs that do 4K/120Hz/HDR (within the limits of compressed stream decode, because HDMI 2.0 simply lacks the bandwidth to go above 4K/60Hz at 4:4:4).

            OLED delivers vastly better contrast/viewing-angles/gamut coverage/response times and yet somehow you’re desperately trying to get me to link something that doesn’t exist yet (a 4K/high-refresh/VRR/HDR monitor). You can easily get any three of those four things at once – as I’ve been saying – but since HDR is a newer feature, the existing 2017 options lack that.

            My issue (and original comment) which you seem intent on derailing, is that there’s no reason for this product to be so damned expensive. High-refresh VA panels with HDR were the mainstay focus for new monitors from Samsung, AOC, Acer, Asus, and even new entrants like MSI all throughout 2017.

            • Kretschmer
            • 2 years ago

            I agree that bleeding-edge (and quite frankly 2015-era) gaming monitors command a huge premium when compared to televisions and pedestrian monitors and welcome competition from OLED TVs with VRR. If that’s your point, I don’t understand this quote:

            [quote<]5x markup on monitor prices just for the sake of G-Sync, which is a stupid way to do VRR in the first place and only helps Nvidia perpetuate this ridiculous money-maker.[/quote<] You're implying that GSync is what makes these monitors so much more expensive than televisions, when the premium for cutting-edge DPI/refresh rates/VRR affects both GSync and FreeSync options. Which is why we asked you to produce non-GSync SKUs of the same thing for 20% of the price. [quote<]My issue (and original comment) which you seem intent on derailing, is that there's no reason for this product to be so damned expensive. High-refresh VA panels with HDR were the mainstay focus for new monitors from Samsung, AOC, Acer, Asus, and even new entrants like MSI all throughout 2017.[/quote<] High-refresh-rate panels are not fungible. Obviously you're going to pay a premium for the very first 4K panels over slower and lower-res VA offerings. I mean, are you shocked that 1440P panels command a premium over 1080P panels or that TN goes for less than IPS? Wait a year for the AOCs of the world to get their hands on these panels and sell monitors for half the price. Hope this explanation helps!

            • PixelArmy
            • 2 years ago

            “You can easily get any *three of those four* things at once”
            “There’s no reason for this product to be so damned expensive”

            The 3-out-of-4 comment is why this whole thing is irrelevant; we’re talking about something that checks *all* the boxes and is basically (to borrow an HR term) a unicorn, hence the price.

            And as for links, there are two types that have been requested: a) rhetorically asking for an existing monitor that matches the “unicorn”; b) links backing *your claims* of “4K/Freesync/HDR and WQHD/Freesync/HDR are commonplace though, Asus stop at 100Hz”.

            “As for *100Hz WQHD HDR, there are both Freesync and G-Sync options from Asus*”
            “You can *easily* get any three of those four things at once”

            So easy you have 3 whole examples, not counting the Asus’ you refer to that don’t exist.

            Agenda: wanting some semblance of a truthful discussion w/o made-up products.

            • Kretschmer
            • 2 years ago

            What’s weird is that he had a valid point (PC Gaming monitors are stupidly expensive) and then wandered off the path into blaming premium pricing for bleeding-edge tech on the GSync module. Taken with the falsehoods above about non-GSync monitors having issues with Nvidia GPUs at >60Hz refresh rates, I’m really scratching my head about the GSync hate. I’ve used GSync and FreeSync1 and prefer GSync, but it’s not a religion.

            • Chrispy_
            • 2 years ago

            True, I re-read my original post and do blame G-Sync.

            Whilst it’s definitely responsible for an unnecessary “premium” it’s not enough by itself the justify the stupid rumoured price of this monitor. As for the unicorn pricing on this thing, I think my complaints are that although this is the first product to tie all of those four mentioned features together in one package, none of those features individually cost much more than omitting them. Sure, I’m expecting it to be expensive – because high-refresh can double the cost compared to an equivalent low-refresh monitor, but that still doesn’t explain the costs. IMO, I was expecting something more like this:

            ~$450 for a base 28″ 4K monitor. Samsung/Asus/Acer/Benq all have options at or below this price

            Double it to $900 for high-refresh panel. Yes, that’s a rip-off, but it’s the going rate for these things.

            Add $200 for G-Sync, that takes it to $1100.

            Add another 20% for unicorn, or G4M3R-bling RGB LED additional tat and gimmicks. Whatever!

            It’s still nowhere near $2500 and I’ve already been generous with my cost estimates here.
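The steps above, spelled out (all figures are the commenter's own rough guesses, not quoted prices):

```python
base = 450                       # plain 28" 4K monitor
high_refresh = base * 2          # doubling for a high-refresh panel -> $900
with_gsync = high_refresh + 200  # add the G-Sync module premium -> $1100
estimate = with_gsync * 1.2      # +20% "unicorn"/gamer-bling tax
print(round(estimate))           # ~$1320, vs. the rumoured $2500
```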

            • K-L-Waster
            • 2 years ago

            How much for HDR and Quantum Dot at those pixel densities and with those refresh rates?

            My take on it is that getting one panel to do all of those things well at the same time is the real challenge.

            • Kretschmer
            • 2 years ago

            The complexity of these features is likely multiplicative, instead of additive. It’s not just HDR, but HDR that can run at multiple times the refresh rate of a standard TV. Oh, and that HDR backlight can’t rely on a fixed refresh rate – content might require it to run at 67 Hz one second and 119 Hz the next second. So you need to adapt the technology. And that’s just one feature – getting it all to work together must be an engineering marvel.

            Acer and Asus have to search for the panels that work with all this functionality without failing to hit the refresh rate, without exhibiting excessive backlight bleed, etc. – we’ll be able to buy the “B” panels from AOC or MSI soon enough. 😉

            And there’s no economy of scale. Samsung will sell a million TVs of one SKU, but these bleeding edge computer monitors are probably looking at tens of thousands of units for 2018.

            While there’s certainly a premium for early adopters, it’s likely smaller than the initial sticker shock suggests. And even if these things were cheap to make, there are enough “sky is the limit” PC gamers to sell the first production runs at whatever price Asus and Acer desire. That guy with a Volta Titan and 64GB RAM needs a monitor to properly display League of Legends and CS:GO…

            I personally find these prices ludicrous and am waiting for them to come down to earth. The problem with PC monitors is that this doesn’t seem to be happening…

            • stefem
            • 2 years ago

            I’m not sure (I may even be sceptical) that a 4K 55″ panel costs much less than a 27″ 4K high-refresh panel (especially for OLED); pixel density (which moves along with pixel size) is a major challenge at the required performance.
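To put rough numbers on that density gap, PPI follows directly from resolution and diagonal size. A back-of-the-envelope Python sketch (my own arithmetic, not from the thread; real panels deviate slightly from nominal diagonals):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # diagonal length in pixels
    return diagonal_px / diagonal_in

# Same 3840x2160 resolution, very different pixel densities:
print(round(ppi(3840, 2160, 27)))  # 27" 4K monitor: ~163 PPI
print(round(ppi(3840, 2160, 55)))  # 55" 4K TV: ~80 PPI
```

So a 27″ 4K panel has to pack roughly twice the linear pixel density of a 55″ 4K TV panel, which is part of why the smaller panel isn't automatically cheaper to make.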

          • FranzVonPapen
          • 2 years ago

          I’m doing that right now. Got a 4K 55″ almost exactly two feet from my face.

            • Chrispy_
            • 2 years ago

            IMO it’s vastly superior to a multi-monitor setup of two 27″ screens, because there’s no bezel interrupting your view.

      • gerryg
      • 2 years ago

      Pssh, that’s pocket change for me and my Beverly Hills peeps. I need one for my Tesla while it ferries me around. Just no room for it in the Ferrari.

      • Laykun
      • 2 years ago

      “if the latest [b]rumors[/b] we heard are true”

      • albundy
      • 2 years ago

      Hah, I’ve spent less than that on a 65″ 4K HDR OLED TV!

        • techguy
        • 2 years ago

        Mine’s bigger

    • Chrispy_
    • 2 years ago

    Why is G-Sync so far behind the technology curve?

    HDR and G-Sync together is nice, but Freesync displays have done this for at least a year now, and they [i]also[/i] come in ultrawide, curved, and ultrawide-curved variants – none of which cost $2500. Hell, for that money, OLED or bust.

      • Techonomics
      • 2 years ago

      [quote]Hell for that money, OLED or bust.[/quote] That's really what it will come down to. The only things holding back OLED for desktop use at this point are panel size and image retention issues, the latter of which LG has been improving upon. If we start seeing 40-46" OLEDs (TVs or otherwise) in 2019, I think a lot of gamers will look at OLED options despite IR. LED-LCD, no matter how it's dressed up, simply cannot command the MSRP it used to. Samsung found that out last year with QLED.

        • snarfbot
        • 2 years ago

        We will definitely see smaller screens, because LG built a new factory making 8K OLED panels.

        So if the smallest 8K panel is 55″, then we can expect ~27″ 4K OLED monitors and up – that's if they decide to go that route. They could have been selling 1080p OLED monitors for the last 4 years, and they haven't.

        Like you said, probably because of image retention. I guess we'll see.

      • auxy
      • 2 years ago

      There are 120Hz 4K Freesync monitors? (‘ω’)

      • stefem
      • 2 years ago

      [quote]HDR and G-Sync together is nice, but Freesync displays have done this for at least a year now[/quote] Yep, and they suck on many fronts. HDR is not yet (properly) there for PC.

      • Durante
      • 2 years ago

      There isn’t a single monitor out there right now that remotely matches these.

      Slapping an “HDR” label on something that doesn’t have the requisite contrast ratio or maximum brightness doesn’t make it more advanced.

    • uni-mitation
    • 2 years ago

    Am I the only one who is a bit uncomfortable with the idea of eye-tracking hardware/software? It feels like a prerequisite for those telescreens, albeit very high-definition ones, for the Two Minutes Hate.

    uni-mitation

      • cpucrust
      • 2 years ago

      I’m concerned that this is simply another way for software/hardware to monetize private data. Games that support it will tout it as a standout feature, but then newer titles will require a connection to servers to submit user metrics – in order to enhance the software experience, of course…

        • uni-mitation
        • 2 years ago

        It could be argued that, in the remote possibility of a surveillance state, those tools would already have been developed by the private sector. In fact, the assumption that private industry would be any impediment to such a regime is not a given if the interests & destinies of those giants of industry are intertwined with it.

        And in contrast to such fictional dystopias, I fear the current titans of industry are not in much of a rush to guarantee our civil liberties, given the recent & current events that expose such complicity. And the masses remain docile & complacent with their bread and gladiatorial fights.

        uni-mitation

    • K-L-Waster
    • 2 years ago

    Winner in the “Most Desirable Unattainable” category.

    • cynan
    • 2 years ago

    So how many years will I have to wait for a 3840×1600 38″ G-SYNC display with HDR (for $1000 or less)? 4K is great and all, but packing all those pixels into a 27″ display for gaming is a bit of a waste considering the GPU horsepower required.

      • auxy
      • 2 years ago

      Why is it a waste?

        • Kretschmer
        • 2 years ago

        You’re buying a monitor with a 120-Hz refresh rate, but you can only run fancy games at 60 FPS with the best consumer GPUs available. It’s much smarter to buy a cheaper 1440p SKU today and then upgrade to a 4K SKU when GPU tech catches up.

        I have a 1080Ti and would consider these displays a downgrade from my X34, because I’d have to either dial down detail with a sledgehammer or putz along at 50-75 FPS.

          • auxy
          • 2 years ago

          Oh, just shut up! I can run all sorts of “fancy games” at >100 FPS in 4K resolution, even compensating for the DSR inefficiency penalty!! (; ・`д・´) You don’t have to drop settings to the floor, either – a little compromise on AA quality is usually all it takes. All you need at 4K is SMAA anyway; anything else is just a smear filter (TAA) or impractical (MS/SSAA).
          Examples:
          Warframe (max settings = ~140 FPS avg.)
          Rise of the Tomb Raider (ultra settings without AA = ~100 FPS avg.)
          Titanfall 2 (high settings with FXAA = ~110 FPS avg.)
          Mirror’s Edge Catalyst (mix of high and ultra settings = ~100 FPS avg.)
          Overwatch (epic settings without AA = 120 FPS capped)
          Doom (nightmare settings with SMAA = ~120 FPS capped).

          And that’s not even to talk about all the less-demanding games that I play that ran fine at >100 FPS on my R9 Fury X. (`・ω・´) I keep pestering Zak to do some testing to demonstrate this but I guess it’s not up to him. I know he’s also tired of hearing the old “durr can’t run games in 4K” crap though.

            • cynan
            • 2 years ago

            I’d just prefer to sacrifice some of the pixel density for a larger screen. And I think the wide aspect, for gaming and media, further improves immersion.

            I’m not going to begrudge anyone their 4K 27″ screen, but the fact that you can’t buy, for example, a 3840×1600 38″ G-SYNC monitor doesn’t make any sense to me, as I would think most people would appreciate the trade-off. Heck, they are coming out with 35″ G-SYNC HDR monitors with the same aspect ratio, but infuriatingly, they don’t give you the full 4K horizontal resolution (only 3440×1440).

            I think for someone with a single high-end GPU – something in the 1080 Ti/1180 range – pushing 3840×1600 at 60-100 Hz depending on the game, with variable refresh, would be a sweet spot, considering it’s about 25% fewer pixels than full 4K.
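The pixel arithmetic there checks out; a quick sanity-check sketch (my own numbers, not from the comment):

```python
# Compare total pixel counts of full 4K UHD vs. the 3840x1600 ultrawide format.
uhd = 3840 * 2160        # 8,294,400 pixels
ultrawide = 3840 * 1600  # 6,144,000 pixels

savings = 1 - ultrawide / uhd
print(f"{savings:.0%} fewer pixels to drive")  # ~26% fewer
```

So "about 25%" is right – the exact figure is just under 26%, a meaningful reduction in GPU load at the same horizontal resolution.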

            • Kretschmer
            • 2 years ago

            It’s interesting that you’re getting higher FPS at 4K than I get at 3440×1440 with a 1080 Ti (e.g. Doom, Titanfall 2; I’d have to check the other games). You’re either running SLI, misunderstanding your settings, or lying.

            • RAGEPRO
            • 2 years ago

            Eh, auxy isn’t a liar. It’s probably a combination of factors; for one thing, auxy is a gonzo overclocker with her CPU running bare-die and her (1080 Ti) GPU running at the ragged edge. She also runs her Windows install without any kind of security software or updates, which means no KPTI or Spectre patches for her.

            Combined with her experience in tweaking game settings I have little doubt she’s getting the numbers she says, but I would take her settings there with a grain of salt as she’s known to disable lots of effects for preference (i.e. most post-processing) yet still claim “max settings” due to having textures and such topped-out. 🙂

            • Kretschmer
            • 2 years ago

            That makes sense. I was just cranky at being told to “shut up” when every reputable review site will tell you that GPUs aren’t up for 4K at 100+ FPS without severe trade-offs.

            • psuedonymous
            • 2 years ago

            Here’s an easy trick to make modern games run at high framerates at UHD: see that setting that says “ULTRA” or “NIGHTMARE MODE” or the like? Back it off to regular old “High”. Visually identical, huge performance gain, may cause e-penis shrinkage.

            • Kretschmer
            • 2 years ago

            Yeah, I never run at “ultra” and my 1080Ti still falls below 100FPS in several AAA titles.

      • leor
      • 2 years ago

      I second this – I have a 34″ 3440×1440 and I miss having 1600 pixels of vertical space. 27 inches seems so weird; you would think at least 32″ for 4K with those specs.

      • brucethemoose
      • 2 years ago

      Maybe not long?

      HDMI adaptive sync will show up in 40″ TVs relatively soon.

      • lifestop
      • 2 years ago

      I don’t understand – high PPI is the dream! Have you ever looked at your phone and marveled at its tightly packed pixels? Scaling fixes the issues with icons being too small, so what’s the problem?

      I’m honestly at a loss as to why people want to buy monitors bigger than 27″. I mean, I’ve been looking for a 24″ 1440p monitor for some time now, but there isn’t one that meets my standards.

    • Neutronbeam
    • 2 years ago

    Estimated time until monitor availability: 2 weeks
    Estimated time required to save enough to buy the monitor: 18 months

      • morphine
      • 2 years ago

      … and by then it’s already outdated anyway.

        • stefem
        • 2 years ago

        Monitors aren’t exactly like smartphones…

        • Captain Ned
        • 2 years ago

        My Dell 2007WFP begs to differ.

      • kvndoom
      • 2 years ago

      …which is why they announced it 15 months ago. You should almost be ready!

      • Kretschmer
      • 2 years ago

      At the rate of release for new monitor tech, this display should last you until 2120. So at least it’s got legs…

      • Kretschmer
      • 2 years ago

      Edit: Wrong thread.

    • deinabog
    • 2 years ago

    Two-thousand five hundred for a 27-inch screen? I’ll stick with my 2K monitors for the foreseeable future.

      • cynan
      • 2 years ago

      I say a silent prayer every night that my old Dell u3011 will still be with us when powering up the next day.

      With these prices, I don’t relish the prospect of shopping for a new high-end G-SYNC gaming monitor at the moment, and if forced, would probably end up with a Freesync model given the options, despite having an Nvidia GPU.

      • Srsly_Bro
      • 2 years ago

      It’s only .5 Ks more.
