LG puts a price on HDR with $1000 32UD99-W

HDR is coming into its own in the television world, but the standard is only just beginning to make its way to the PC side of things. Right now, every display that supports HDR is noteworthy. LG's new 4K, 32-inch 32UD99-W monitor was first shown at CES earlier this year and is now available.

The HDR10-compliant 31.5" IPS display covers 95% of the DCI-P3 color gamut. The 32UD99-W offers onboard color-space switching, a feature that'll probably come in handy for graphics professionals. Seeing as HDR is the headlining characteristic of this display, it'll light up your room a bit better than your run-of-the-mill screen. LG says peak brightness should hit 550 cd/m², compared to the 350 cd/m² offered by most screens. The typical contrast ratio is 1300:1, an upgrade over the common 800:1 or 1000:1 figures.
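
For a rough sense of what those numbers imply, here's a quick back-of-the-envelope sketch in Python (our own arithmetic, not anything from LG's spec sheet): a panel's black level follows from dividing its peak brightness by its contrast ratio.

    # Back-of-the-envelope check of the quoted specs (not official LG figures).
    # Contrast ratio = white luminance / black luminance, so the implied black
    # level is peak brightness divided by the contrast ratio.
    peak_nits = 550   # LG's quoted peak brightness, in cd/m^2
    contrast = 1300   # LG's quoted typical contrast ratio (1300:1)

    black_level = peak_nits / contrast
    print(f"Implied black level: {black_level:.2f} cd/m^2")  # ~0.42 cd/m^2

    # A typical 350 cd/m^2, 1000:1 monitor sits around 0.35 cd/m^2, so the
    # 32UD99-W's extra contrast comes from brighter whites, not deeper blacks.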

While the 32UD99-W could be a great monitor for graphics work or watching videos, gamers might be put off by the 5-ms response time and 60Hz maximum refresh rate. The monitor does offer FreeSync support in the 40Hz to 60Hz range, and a Game Mode that disables any post-processing.

For inputs, the 32UD99-W offers a USB Type-C port, a DisplayPort connector, and two HDMI 2.0a ports. There are also two USB 3.0 ports as an added convenience. The stand offers plenty of movement with height, angle, and pivot adjustments. LG packed in a pair of 5-watt speakers in case you don't already have a set on hand. On the software side, you get screen splitting, picture-in-picture, and color and game presets via the included utility.

The LG 32UD99-W is available right now for $999 at B&H Photo Video, Adorama, or Amazon.

Comments closed
    • EndlessWaves
    • 3 years ago

    No mention of local dimming, so this probably isn’t an HDR screen – just a standard wide gamut monitor that supports PQ.

    • Chrispy_
    • 3 years ago

    [quote<]a Game Mode that disables any post-processing[/quote<] Ugh, why do monitors even [i<]have[/i<] post-processing in the first place?! A monitor has one job. [b<]ONE. JOB. To display the output from the graphics card as accurately as possible.[/b<] Anything the monitor adds to the signal [i<]at all[/i<] reduces the accuracy, and obviously adds processing lag. How does any moron involved with monitor design consider "accuracy reduction" and "lag" a feature worth spending time on?

      • cygnus1
      • 3 years ago

      Honestly, I wouldn’t be surprised if the guts of this are mostly re-purposed TV parts. The only things, electronics-wise, that really differentiate this from a TV are the DisplayPort and maybe the USB hub functionality (on most TVs, instead of a hub, it’s just downstream ports for the smart features). FreeSync (especially in the 60Hz-and-under variety) is probably a free feature at this point, built into all their controllers, that just has to be flipped on in firmware.

      • Kurotetsu
      • 3 years ago

      That is rather strange for a PC monitor. I could understand if this were a TV, but why would something that is clearly meant for professional work need a bunch of post-processing? Or even a Game Mode, for that matter?

    • Wirko
    • 3 years ago

    To calculate the price of HDR alone, we can subtract the price of the 32UD89-W from that of the 32UD99-W. That’s around 150 EUR in Germany. The only difference, besides HDR, is USB 3.0 vs. 2.0.

    • Major-Failure
    • 3 years ago

    Either I’m blind or the resolution specification is too well hidden in the text. For anyone wondering, it’s 3840 x 2160 pixels (UHD) and *not* 4096 x 2160 (DCI 4K).

    Also, who here runs their screen at 100% brightness? I’ve been running my screens at 5 to 10% brightness for years and never understood why it has to be so bright at 100%. And no, I’m not in a basement 😉

      • ronch
      • 3 years ago

      Because these expensive displays are not just Retina™ displays, they burn your retina out. And you need to crank the brightness level up to 11 to do that.

      • Chrispy_
      • 3 years ago

      4K has meant 3840 x 2160 for LCD panels since the beginning.

      Where are you seeing 4096 x 2160 panels? Do they even exist outside of cinema and high-end consumer projectors?

      They’re certainly nothing I’ve seen in the PC or television market, all of which are either 16:9, 21:9 or 2.39:1

        • derFunkenstein
        • 3 years ago

        The 4K 21.5″ iMac has a 4096×2304 panel, and an effective resolution of 2048×1152.

        [url<]http://www.apple.com/imac/specs/[/url<]

          • Waco
          • 3 years ago

          I will never understand these oddball resolutions.

          That said, if Windows (and Windows applications/games) wasn’t so *bad* with scaling, the oddball resolutions wouldn’t matter.

            • derFunkenstein
            • 3 years ago

            I think Microsoft messed up when they went for percentages instead of effective resolutions. Say my 15.6″ laptop has a 2560×1440 panel, and I want everything to appear as if it’s at 1920×1080. While the actual resolutions are different on a Mac, the process in macOS is that it would draw an image of 3840×2160 with everything twice as large as it would normally be (100% larger, which is brain-dead simple) and then scale that image down to 2560×1440.

            That’s how the MacBook Pro can have effective resolutions up to 1680×1050 on a 2560×1600 display ([url=http://www.apple.com/macbook-pro/specs/<]see here[/url<]).
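
            Here’s a minimal Python sketch of that approach, using the hypothetical laptop numbers above (illustrative only, obviously not Apple’s actual code):

            [code<]
            # macOS-style scaling as described above: draw everything at 2x the
            # desired effective resolution, then resample down to the panel.
            def macos_style_scaling(effective, native):
                # Step 1: render at exactly 2x the effective resolution, so UI
                # layout stays brain-dead simple (everything is integer-scaled).
                backing = (effective[0] * 2, effective[1] * 2)
                # Step 2: resample the oversized image down to the native panel.
                factor = native[0] / backing[0]
                return backing, factor

            backing, factor = macos_style_scaling(effective=(1920, 1080),
                                                  native=(2560, 1440))
            print(backing)           # (3840, 2160) -- the image that gets drawn
            print(round(factor, 3))  # 0.667 -- non-integer downscale to the panel
            [/code<]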

            • Waco
            • 3 years ago

            Yep. I wish there were an elegant way to do that in Windows. With a 4K screen, it’s annoying how many things either don’t support DPI scaling at all (and have horridly tiny UIs) or mess it up when it’s on anything other than a linear scale.

            That, and many applications just fail to run at all if DPI scaling is enabled for the Windows UI (regardless of whether you check the “disable DPI scaling” box for the executable in question).

            Sigh.

            • derFunkenstein
            • 3 years ago

            I’m sure the reason Microsoft chose the route they chose is that doing what Apple did is the most graphically-expensive method to do it. All of Apple’s machines that support this at least have Crystal Well-enhanced graphics (GT3e), if not discrete chips.

            • EndlessWaves
            • 3 years ago

            Microsoft’s approach is the better one for picture quality. Giving programs access to the exact pixel grid and specifying sizing allows them to do per-pixel (and even per-subpixel) effects.

            Apple’s method of rendering at 5120×2880 for 2560×1440 sizing and then scaling down to the 3840×2160 screen resolution produces the same loss of sharpness/exactness as any scaling does.

            • derFunkenstein
            • 3 years ago

            It’s not good for picture quality. Take the lowly Windows Forms app. For something that is so incredibly under Microsoft’s control, the form designer and scaling just don’t get along. It’s awful. It’s why I don’t do my VB homework on my laptop: the forms I design that look great at 125% are horrifying pieces of shit on a 100% desktop monitor.

            • Waco
            • 3 years ago

            I’m sure, but anyone with a high DPI display in Windows probably has a decent enough GPU to do display scaling. Hell, Intel integrated has been good enough for this for years.

            • derFunkenstein
            • 3 years ago

            That’s probably true. There’s always a chance Microsoft just never thought of it on their own (although that itself seems silly). The Retina MacBooks came out in late 2012, same time as Windows 8. Windows 10 seems to be using a lot of the same scaling techniques baked into Windows 8. It could be that Windows 8 and the underlying tech was too far along to make a change, even if MS wanted to.

            • UberGerbil
            • 3 years ago

            I’m still using a seven-year-old Dell SP2309W @ 2048×1152. It’s actually a pretty decent resolution for a lot of things — you can work on or play back 1080p content with room for a toolbar or playback controls. Though I now have it in a vertical orientation in a multi-monitor setup, where it works well for a lot of text-heavy applications.

        • the
        • 3 years ago

        [url=http://www.lg.com/us/monitors/lg-31MU97-B-4k-ips-led-monitor<]LG makes a 4096 x 2160 model.[/url<] I presume there are a few others that use the same panel.

      • EndlessWaves
      • 3 years ago

      The extra brightness in HDR is designed for backlights that can be varied across the screen. The average brightness is supposed to stay the same.

      It’s essentially increasing contrast by boosting the bright spots rather than the normal approach of darkening the black spots.

        • Airmantharp
        • 3 years ago

        In all honesty, it should be *both*. HDR should be both brighter *and* darker than what we’re used to, at the same time.

    • deruberhanyok
    • 3 years ago

    Getting closer… just add a 120Hz refresh rate to that feature list and I’ll have found a new monitor. 🙂

    • Sargent Duck
    • 3 years ago

    How about dropping the crappy 5-watt speakers and shaving $10 off the price?

      • NarwhaleAu
      • 3 years ago

      You mean $2?

        • derFunkenstein
        • 3 years ago

        Assume a markup of several thousand percent. Let’s call it an even $50.

          • NoOne ButMe
          • 3 years ago

          But do you think the manufacturer will try to keep the margin the same, raise it, or lower it?

            • derFunkenstein
            • 3 years ago

            If anything, I’d expect them to offer a speaker-free version for the same price. It costs money to pay someone to remove features from the firmware. 😆

            • NoOne ButMe
            • 3 years ago

            And sell it as a more power efficient model probably.

            • derFunkenstein
            • 3 years ago

            There’s a future for you in marketing.

            • NoOne ButMe
            • 3 years ago

            I happen to know many more people in the advertising business than I care to admit.

      • Airmantharp
      • 3 years ago

      If it is an output device for HDMI, it needs speakers.

    • nico1982
    • 3 years ago

    I didn’t expect this to be available this early. It is a steal next to the UP3216Q. I’m on the fence between this and the UltraFine 5K as a U2713H replacement.

    Too bad USB-PD is limited to 60W.

      • morphine
      • 3 years ago

      [quote<]Too bad USB-PD is limited to 60W.[/quote<] Hardly a deal-breaker characteristic?

        • nico1982
        • 3 years ago

        I think it depends. Right now it’s probably not a deal breaker for most users. For me it’s something to consider, as my laptop pulls up to 85 watts. More broadly speaking, monitors are expensive purchases that will likely outlive one or more PCs. I’d prefer them to be more future-proof than to save a few dozen dollars on a $1000 purchase.

    • brucethemoose
    • 3 years ago

    Eh… The real monitors to wait for are the Samsung CHG70/CHG75

    -1440p
    -27″ or 31.5″
    -144Hz
    -G-Sync or FreeSync
    -And most importantly, quantum-dot VA (which means better contrast than this LG monitor).

    550 nits on an IPS screen with no local dimming doesn’t really count as HDR.

      • tay
      • 3 years ago

      I like the VA panels for their blacks but the Samsung CF791 has had a lot of issues with FreeSync. Sad!!

        • brucethemoose
        • 3 years ago

        It’s “Freesync 2”, so maybe they fixed some of the issues.

        • funko
        • 3 years ago

        but Samsung has all the greatest panels! Great guys. Brilliant. IPS is fake tech! Just a brand name! Overrated and not impressive at all!!

      • pdjblum
      • 3 years ago

      thanks for that heads up

      the 31.5″ one has just the specs I have been looking for on the FreeSync side

      • nico1982
      • 3 years ago

      [quote<]550 nits on an IPS screen with no local dimming doesn't really count as HDR.[/quote<] I like standards more than opinions 😛

        • NoOne ButMe
        • 3 years ago

        The HDR10 standard is 1000 nits (the spec goes up to 4000). Dolby HDR is 4000 nits (the spec goes up to 10,000).
        “UHD Premium” is 1000/540 nits for LED/OLED.

        550 is barely half of the more popular spec.
        [url<]http://www.eurogamer.net/articles/digitalfoundry-2016-the-best-4k-screens-for-hdr-gaming[/url<] looks at some HDR 4K screens. Sadly they’re all TVs, due to Digital Foundry covering consoles and a lack of HDR consumer monitors.

          • Quiet Sun
          • 3 years ago

          This monitor may accept an HDR10 signal, but it cannot display HDR in any meaningful sense.

          • nico1982
          • 3 years ago

          Cool. Now I sounded like a smartass 🙂

          TFT Central covered the issue, and the 32UD99 specifically, in their excellent HDR article: [url<]http://www.tftcentral.co.uk/articles/hdr.htm[/url<]

            • NoOne ButMe
            • 3 years ago

            Thank you for the article stating HDR10 is UHD Premium.

            I was not 100% sure.

            • EndlessWaves
            • 3 years ago

            It isn’t.

            HDR10 is a content format. UHD Premium is a hardware certification program.

            A UHD Premium label doesn’t mean that a TV can show the entire range that HDR10 can encode. Notably there’s no requirement for local dimming in UHD Premium and the colour space is much smaller.

            UHD Premium is best thought of as a stepping stone on the road to full HDR. If a screen clears that bar then it’s likely a reasonable entry level/early HDR model.

            I’m not quite sure where this claim of HDR10 being 4000 comes from. As far as I know it uses ST.2084, which is 10,000 cd/m².

            • NoOne ButMe
            • 3 years ago

            Thanks again for the clarification…

            A little more digging on my part suggests the 4,000-nit figure is a mistake that the easy sources I found all made.

            It seems to have come from HDR10 content being mastered at 4,000 nits, and sites seeing that number and running with it.

            10,000 nits should indeed be the limit for both Dolby and HDR10.

      • Major-Failure
      • 3 years ago

      1440p is not an option if you want high-quality upscaling of 1080p content (Blu-ray playback, as well as the content people using this screen will be producing). For that you need 2160p (i.e., exactly 2 x 1080p in each dimension).
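
      To illustrate (a toy Python sketch, not a real video scaler): every 1080p pixel maps cleanly onto a 2x2 block of 2160p pixels, so nearest-neighbour scaling needs no interpolation at all.

      [code<]
      # 2160p is exactly 2x 1080p in each dimension, so each source pixel maps
      # onto a 2x2 destination block. A 1440p target (1.33x) must interpolate.
      def integer_upscale(image, factor=2):
          # Nearest-neighbour upscale by an integer factor; image is a list of rows.
          out = []
          for row in image:
              scaled_row = [px for px in row for _ in range(factor)]
              out.extend([scaled_row] * factor)  # repeat the row vertically
          return out

      frame = [[1, 2], [3, 4]]  # stand-in for a tiny 2x2 corner of a 1080p frame
      print(integer_upscale(frame))
      # [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
      [/code<]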

      • Chrispy_
      • 3 years ago

      I have been using IPS for several years because VA fell out of favour with the manufacturers for a while whilst IPS was getting all the investment and iLove from the Apple/LG.Philips relationship. Now that good-quality VA panels are back on the market, there is no contest:

      [list<][*<]You can argue that IPS has better viewing angles but it genuinely doesn't - at modest angles where VA starts to lose colour accuracy, IPS is well into corner-glow affecting large portions of the screen. This ruins contrast, accuracy of dark colours and of course totally trashes the screen uniformity. Admittedly there are some IPS panels that use polarisers to minimise corner-glow but these have fallen out of favour because they reduce contrast, something IPS is troubled by even before the addition of extra polarisers. [/*<][*<]You can argue that IPS has better response times, but there is no evidence to back that up and plenty of evidence to suggest that they're similar [i<]on average[/i<]. There are no 200Hz IPS panels, but there are 200Hz and 240Hz VA panels. Before you argue that VA can't transition that fast, I'd like to point out that neither can IPS. The fastest IPS panels are actually the 100Hz models, with an average 4-5ms response time; the 144Hz panels from LG struggle to get lower than 9ms. Meanwhile the average 200Hz VA panel (like the one used in the AOC Agons) averages 2.6ms. [/*<][*<]Finally you can argue that IPS is capable of higher brightness levels. This is undoubtedly true, but I challenge anyone to stare at a display brighter than 250 nits for a whole day and come away without eyestrain and fatigue.[/*<][/list<] Here's hoping that 2017 is the end of low-quality panels. Regardless of the competition at the high end, can we just nuke 1366x768 TN panel manufacturers from orbit?
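
      For what it's worth, the frame-time arithmetic behind that response-time point, in a few lines of Python (using the figures quoted above, not fresh measurements):

      [code<]
      # A pixel transition slower than one refresh interval means the panel is
      # still mid-transition when the next frame arrives.
      panels = {
          "100Hz IPS": (100, 4.5),  # refresh rate (Hz), avg response time (ms)
          "144Hz IPS": (144, 9.0),
          "200Hz VA":  (200, 2.6),
      }

      for name, (hz, response_ms) in panels.items():
          frame_ms = 1000 / hz
          verdict = "keeps up" if response_ms <= frame_ms else "smears"
          print(f"{name}: {frame_ms:.1f} ms/frame vs {response_ms} ms -> {verdict}")

      # 100Hz IPS: 10.0 ms/frame vs 4.5 ms -> keeps up
      # 144Hz IPS: 6.9 ms/frame vs 9.0 ms -> smears
      # 200Hz VA: 5.0 ms/frame vs 2.6 ms -> keeps up
      [/code<]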

        • cygnus1
        • 3 years ago

        [quote<] can we just nuke 1366x768 TN panel manufacturers from orbit? [/quote<] Can't agree with this more. My work laptop is exactly that kind of panel and so it sits in a dock at home or at the office and almost never goes mobile.

      • Flying Fox
      • 3 years ago

      Any idea when these will arrive? Sounds promising. Right now I am eyeing the Dell 3417 curved.
