Asus unveils its first Mini-LED monitors in the ProArt PA27UCX and PA32UCX

We missed this one during CES, but back in January, Asus announced the ProArt PA32UCX display. That's a 32" monitor with some seriously high-end specs, which we'll go into in a moment. However, a 32" 16:9 display takes up a lot of desk space, and not everyone wants a screen that big. Those folks might prefer the just-announced PA27UCX, which is nearly the same display in a smaller size. Let's check 'em out.

Asus ProArt PA32UCX, apparently. That's it, that's the only pic I have. Sorry.

Both of these displays are LED-backlit LCD monitors with a 3840×2160 resolution. They top out at a 60-Hz refresh rate, but that's not a big deal for a display aimed at graphic designers and photo editors. No, those folks will likely not care about that at all, and instead be totally stoked at these true 10-bit displays' ability to reproduce 97% of the DCI-P3 color space and a full 89% of the ridiculously wide Rec. 2020 color space. They support both the HDR10 and Hybrid Log-Gamma HDR formats, and include color profiles for Adobe RGB, Rec. 709, DCI-P3, and Rec. 2020.
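
If you're wondering what "true 10-bit" actually buys over a typical 8-bit panel, the arithmetic is easy to check yourself. Here's a quick Python sketch (our back-of-the-envelope math, not anything from Asus' spec sheet):

    # Shades per channel and total colors for 8-bit vs. true 10-bit panels.
    for bits in (8, 10):
        shades = 2 ** bits    # levels per color channel (R, G, and B)
        colors = shades ** 3  # total displayable colors
        print(f"{bits}-bit: {shades:,} shades/channel, {colors:,} colors")

    # 8-bit: 256 shades/channel, 16,777,216 colors
    # 10-bit: 1,024 shades/channel, 1,073,741,824 colors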

There are reasons for gamers and movie nerds to be into these displays too, though. Their peak brightness tops out at a staggering 1200 cd/m². For context, your typical desktop display peters out around 300. Asus isn't giving hard numbers on contrast ratios yet, but it does say that these monitors meet VESA's DisplayHDR 1000 requirements. That specification mandates at least 1000 cd/m² of peak brightness and a black level no higher than 0.05 cd/m² in the corner test. Savvy gerbils will realize that such specs require local dimming, and indeed, the 32" PA32UCX has a full 1,000 local dimming zones. The smaller PA27UCX has 576 zones.
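
Those two mandated figures also imply a floor on contrast ratio, if you feel like running the division yourself. A quick Python sketch (our arithmetic, not an official VESA number):

    # DisplayHDR 1000: at least 1000 cd/m² peak, at most 0.05 cd/m² black
    # in the corner test. The ratio of the two is the implied contrast floor.
    peak_nits = 1000.0   # minimum peak brightness (cd/m²)
    black_nits = 0.05    # maximum black level (cd/m²)
    print(f"Implied contrast ratio: {peak_nits / black_nits:,.0f}:1")  # 20,000:1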

Asus ProArt PA27UCX. Note the slightly thicker bezels.

To pack so many local dimming zones into relatively small screens, these displays use "Mini LED" backlights. That's not to be confused with the forthcoming Micro LED technology that purports to compete with OLED on contrast and brightness. Instead, Mini LEDs are sort of a half-measure toward that technology. While Micro LED displays will use a single LED for each pixel, each Mini LED covers an 8×8 or 4×4 grid of pixels—still quite small, but much cheaper to manufacture than Micro LEDs. Trendforce's LED Inside claims that Mini LED displays cost around 20% more to produce than typical LCD monitors, while Micro LED models will be "more than 3 times" as expensive.
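
However those per-LED grids shake out, the zone counts and the 3840×2160 panels make it easy to estimate how much screen each dimming zone controls. A rough Python sketch, assuming the zones tile the panel evenly (Asus hasn't published the actual grid layouts):

    # Rough pixels-per-dimming-zone estimate for both ProArt panels.
    width, height = 3840, 2160
    for name, zones in (("PA32UCX", 1000), ("PA27UCX", 576)):
        pixels_per_zone = width * height / zones
        side = pixels_per_zone ** 0.5  # edge length if each zone were square
        print(f"{name}: ~{pixels_per_zone:,.0f} pixels/zone (~{side:.0f} px square)")

    # PA32UCX: ~8,294 pixels/zone (~91 px square)
    # PA27UCX: ~14,400 pixels/zone (~120 px square)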

The screens themselves aren't the only interesting qualities of these displays, if you can believe that. They also accept video input over USB Type-C using DisplayPort Alternate Mode. The larger of the two, the PA32UCX, supports Thunderbolt 3, and even has a pass-through port. It'll also accept HDMI 2.0 or regular old DisplayPort connections. Meanwhile, the PA27UCX can run entirely off a single USB Type-C connection, as long as your PC supports USB-PD at up to 60 W. It includes DisplayPort and HDMI connectors as well, along with a USB 3.0 hub.

If you're after some of that Mini LED 4K UHD HDR USB acronym salad, you won't have long to wait. Asus said back at CES that the PA32UCX should land "this Spring," while the PA27UCX should arrive any day now. The company hasn't announced pricing, but with specs like these, we might go ahead and take out that second mortgage.

Comments closed
    • danny e.
    • 7 months ago

    Once you go 32″, you’ll never go back to anything else.

    • psuedonymous
    • 7 months ago

    Cutting through the marketing: it's a FALD backlight with 1.5x the zone density of current monitors. For example, the PG27UQ has 384 FALD zones to the PA27UCX's 576 zones, or a 24*16 array vs. a 32*18 array.

    • DarkUltra
    • 7 months ago

    [quote<]They top out at a 60-Hz refresh rate, but that's not a big deal for a display aimed at graphic designers and photo editors. No, those folks will likely not care about that at all, and instead be totally stoked at these true 10-bit displays' ability to reproduce 97% of the DCI-P3 color space and a full 89% of the ridiculously wide Rec. 2020 color space.[/quote<] 120Hz makes your mouse more responsive and precise, and 24fps footage can run smoothly on a 120Hz monitor.

    • meerkt
    • 7 months ago

    If 576 and 1000 (1024?) zones, at 3840×2160 it’s not 8×8 pixels per LED but 120×120 and 90×90.

    • Johnny Rotten
    • 7 months ago

    The questions will be…

    1) how much heat do 1,000 mini-LEDs give off, and will the panel require active cooling? (no go, full stop)

    2) when will there be 100Hz+ displays?

    • Chrispy_
    • 8 months ago

    [quote<]1200 cd/m²[/quote<] On a [i<]monitor[/i<]? Goodbye, sweet retinas. It was nice knowing you!

    Meanwhile, I set my white point to 150 cd/m² at work, and though I haven't calibrated my current monitor at home, it's set dimmer than my old monitor, which was calibrated to 120 cd/m² (a better black level means it doesn't need to be as bright for good contrast). I would estimate that it's set around 60-90 cd/m².

    My HDR TV is ridiculously bright in full-HDR movie mode. There's absolutely never a need for that level of retinal assault even for a 100-minute movie, let alone on a desktop monitor for longer sessions.

      • meerkt
      • 7 months ago

      For specific highlights in movies or games, I suppose.

        • DarkUltra
        • 7 months ago

        Yes. A starry night glitters like diamonds on an HDR display.

        [url<]https://youtu.be/jL0sQull5Gg[/url<]

          • GrimDanfango
          • 7 months ago

          Yeah, I get the feeling most people don’t even understand the principle behind HDR.

          Certainly, greater than about 120 cd/m^2 as a standard desktop brightness will mess with your eyes – I ran mine higher for a long time, and found I started needing brighter and brighter bulbs in my house to compensate for my lessening eye sensitivity. Thankfully, calibrating all my screens to 120 (and getting a decent pair of sunglasses for when the sun's out!) has slowly reversed the effect.

          But the point of High Dynamic Range isn't to blast your eyes with an average of 1000 cd/m^2…
          …the point is to increase dynamic range (surprising, I know!)
          In simple terms, real light doesn't work on a scale of 0.0-1.0, and there is no such thing as "100% white".
          Real light works on an ever-increasing linear scale of brightness, but your eyes have a naturally non-linear response to increasing light levels, so you naturally perceive higher detail in dark colours and compress brighter shades. Normal monitors lack both sufficient brightness *and* sufficient bit depth to emulate the effect to more than the equivalent of a few camera f-stops (a tiny fraction of what human vision can perceive)… HDR monitors ideally meet a bare-minimum specification for emulating something of the effect you see when you go outside.
          …so yes, the maximum brightness is high – if you had a sufficiently powerful HDR display in the future, you probably shouldn't look at a correctly exposed image of the sun for too long 😛

          I guess the interesting thing could be that realistically rendered HDR games set in sunny climates will someday probably need to emulate sunglasses by stopping down their final output by ~3 f-stops 🙂

          • Chrispy_
          • 7 months ago

          So I've just watched that clip on my HDTV with and without HDR, and honestly, HDR looks far worse to me. Perhaps I don't have the right kind of TV, since it's QLED and not OLED, so even 400+ zones of local dimming simply don't work for point lights like those stars.

          The glittering is an artifact of inadequate FALD resolution, not an accurate portrayal nor something I want to see.

            • GrimDanfango
            • 7 months ago

            Agreed, they're jumping the gun a bit. HDR is little more than a gimmick to shift TVs at this point, *until* we start getting per-pixel HDR.
            I'm slightly concerned that they'll kill off interest in it by the time it's actually a fully viable technology, because the real deal should look stunning!

    • jihadjoe
    • 8 months ago

    Honestly, I'm more interested in another monitor. ASUS also announced the PQ22UC at CES, and it seems to be our first actual production OLED monitor. 4K at 22″ does seem a bit small, but it's an integer multiple of 1080p and thus probably works better. Maybe they'll do a 5K 27″ at a later date.

    • Billstevens
    • 8 months ago

    Kind of feel like I want to hold out on a new monitor until we see some based on Samsung's MicroLED tech.

    This seems to have all the real updates I want: OLED performance without the downsides, truly borderless displays, and it's not based on a backlight from what I can tell.

    Hopefully in another 4-6 years that tech trickles into normal TVs and not just their display walls.

    • DragonDaddyBear
    • 8 months ago

    There is a lot going on in the monitor space. I would love to upgrade, but I don't think the market has reached maturity yet. I'm regretting buying a TV last year before HDMI 2.1 (eARC and VRR) became standard.

    • willmore
    • 8 months ago

    [quote<]For context, your typical desktop display peters out around 300.[/quote<] And here I am with the brightness of my normal everyday monitor set to the absolute minimum, and it's still too bright for comfort. I can't imagine wanting it over 4x brighter. Maybe they're meant for home radial keratotomy?

      • hkuspc40
      • 8 months ago

      Seriously. I have an Asus Eye Care monitor (PB277Q) and it’s set at 25/100 brightness. It gives me a major headache if I have it all the way up.

        • derFunkenstein
        • 8 months ago

        The brightness control probably isn’t perfectly linear and almost certainly doesn’t start from perfectly dark. So it’s hard to tell how much light it’s emitting. Also the specs say that your monitor puts out 350 cd/m^2.

        [url<]https://www.asus.com/us/Monitors/PB277Q/specifications/[/url<]

          • Srsly_Bro
          • 8 months ago

          My old monitor was an Acer P243w and it’s rated for 400 cd/m^2. The screen could light up a small room. The power consumption was quite high.

          • willmore
          • 7 months ago

          Yeah, that's the thing that kills me about LED backlights. LEDs are perfectly linear in brightness vs. control signal when PWM-modulated (not quite as linear when driven with current control, but backlights don't do that anyway, since the color point changes, too). They're not like CCFL tubes, which had a minimum brightness.

          So why do they limit the brightness range so severely?

      • K-L-Waster
      • 8 months ago

      Agreed. I just bought a pair of new monitors, and in the process had to return one and get a different model — so as a result, I've had three new monitors on my desk in the past few weeks. All three of them had their out-of-the-box brightness set high enough that I got eye strain looking at them until I calibrated them. The calibrations all resulted in dropping the brightness to ~35%.

      • DPete27
      • 8 months ago

      Mmm, my C27HG70 tops out at 600 nits.
      It's definitely lacking in the local dimming department, though. I guess I need a new monitor now…

      • Wirko
      • 8 months ago

      You’ll need darker sunglasses than you thought you would.

        • willmore
        • 7 months ago

        You know how strange of a look my optometrist will give me when I ask for "reading sunglasses"?

    • Usacomp2k3
    • 8 months ago

    That looks pretty amazing.

    • Tristan
    • 8 months ago

    Only 576 LEDs for professional monitors aimed at photographers? Ridiculous.
    There will be models with 10,000 LEDs, and for gaming:
    [url<]https://www.digitimes.com/news/a20190403PD212.html[/url<]

      • jihadjoe
      • 8 months ago

      640 LEDs ought to be enough

        • Srsly_Bro
        • 8 months ago

        For a flash drive, maybe.

          • jihadjoe
          • 7 months ago

          “flash” drive lel

        • rnalsation
        • 8 months ago

        Only 640 LEDs?! highLED.sys ought to put a stop to that.

        • UberGerbil
        • 8 months ago

        640K (LEDs) is not enough (for a single DIMM of) RAM these days.
        I wish I was joking.
