Nvidia G-Sync HDR is on the way in high-end displays

Last night during its CES keynote presentation, Nvidia didn't talk up what is ultimately one of its coolest new products, instead revealing it via blog post at GeForce.com: G-Sync HDR, coming in new 4K HDR monitors with Quantum Dot technology and 144Hz refresh rates. These might be the most capable, powerful displays we've yet seen, and we're drooling at the prospect of gaming on one of them.

Nvidia brought G-Sync technology to PC displays a few years ago, allowing for variable refresh rates and helping to reduce screen tearing and input lag. Now, the graphics giant is looking to bring some of the biggest selling points of home theater systems to PC gaming. The company has partnered with AU Optronics to create HDR G-Sync displays. Both Asus and Acer will be bringing the first batch of those displays to market later this year.

According to Nvidia, G-Sync HDR displays will support the HDR10 color standard and get close to covering the DCI-P3 color gamut used in digital cinema. It's still hard to sell HDR by showing it on regular displays, but Nvidia's side-by-side comparison image does a fair job of simulating the difference between SDR and HDR color. The new displays should also be capable of 1,000 nits of eyeball-frying brightness, and feature 384 individually controlled backlight zones.
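
For a rough sense of what that local-dimming spec means in practice, here's a back-of-the-envelope sketch in Python (our own arithmetic; only the 4K resolution and the 384-zone count come from Nvidia, and the 24 x 16 grid layout is an assumption on our part):

    # Rough arithmetic: how coarse is a 384-zone backlight behind a 4K panel?
    width, height = 3840, 2160   # UHD panel resolution
    zones = 384                  # individually controlled backlight zones
    print(f"{width * height / zones:,.0f} pixels per dimming zone")  # 21,600
    # Assuming the zones form a 24 x 16 grid (Nvidia hasn't specified),
    # each zone would light a patch of roughly 160 x 135 pixels.
    grid_w, grid_h = 24, 16
    print(f"~{width // grid_w} x {height // grid_h} pixel patch per zone")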

These screens are also coated with Quantum Dot Enhancement Film (QDEF). Nvidia says the QDEF "is coated with nano-sized dots that emit light of a very specific color depending on the size of the dot, producing bright, saturated and vibrant colors through the whole spectrum, from deep greens and reds to intense blues."

The features listed above are fairly common in recent HDR-capable UltraHD televisions, but those sets tend to introduce input lag, something that can make gaming on them a less-than-ideal prospect. As a data point, some LG OLED displays have both high and inconsistent input lag, meaning there could be anywhere from 30ms to 70ms of delay. Nvidia claims that G-Sync HDR displays, on the other hand, have been built from the ground up for gamers. They incorporate the picture-enhancing elements of those televisions while retaining the stutter- and tearing-free experience we expect from G-Sync.
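
To put those lag figures in perspective, here's a short Python sketch (our own illustrative arithmetic, using the 30ms-70ms range cited above) that converts milliseconds of display lag into rendered frames:

    # How many rendered frames fit inside a given amount of display input lag?
    def frames_of_lag(lag_ms, refresh_hz):
        frame_time_ms = 1000.0 / refresh_hz
        return lag_ms / frame_time_ms

    for lag_ms in (30, 70):      # the OLED lag range cited above
        for hz in (60, 144):
            print(f"{lag_ms}ms at {hz}Hz = {frames_of_lag(lag_ms, hz):.1f} frames")
    # 70ms at 144Hz works out to roughly ten frames of delay.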

Of course, we have to mention that G-Sync HDR was announced shortly after AMD unveiled FreeSync 2, a standard aimed at doing many of the same things. As PC World notes, FreeSync 2 is an attempt from AMD at establishing an industry standard, while G-Sync HDR is a product that manufacturers will work with Nvidia to integrate into their displays. FreeSync 2 has the potential to spread more easily, but Nvidia's control over G-Sync may result in a more consistent experience once the hardware sits on our desks. Time will tell.

The displays coming from Asus and Acer are expected to hit later this year, but neither manufacturer has provided a price. Even as the displays themselves get easier on our eyes, expect the prices to be more eye-watering than ever.

Comments closed
    • floodo1
    • 3 years ago

    Will be interesting to see if G-Sync HDR continues the situation where G-Sync offers an overclocking mode for the panel, allowing G-Sync monitors to be offered with higher refresh rates than the FreeSync version of the same monitor (e.g. Acer Predator X34 at 100Hz vs. Acer XR341CK at 75Hz).

    Also if the trend continues where FreeSync monitors typically have smaller adaptive sync ranges (usually they have a higher minimum refresh rate).

    Finally, will G-sync ever be price competitive? Likely answer: HELL NO )-8

      • Klimax
      • 3 years ago

      For the last question: FPGA versus ASIC. We don’t know yet which it will be this time.

    • floodo1
    • 3 years ago

    That monitor is good stuff for the multi-GPU crowd! Single GPUs still can’t push 8 million pixels (4K) per frame in demanding games, and definitely not anywhere near 144fps (the arithmetic sketch after this comment quantifies the throughput involved).

    If you look at GTX 1080 performance at high resolutions (>4 megapixels, i.e. 4K, 5K, some ultrawides), it becomes obvious that the push for 4K has really come from the “TV & Film” industries. Also, Japan has been moving toward 8K for television content for a long time now, and HDR is already becoming mainstream for TVs.

    Hopefully GPUs can catch up a little bit faster and we can all end up with 8K “HDR” @ 144Hz in >30in sizes for <$500 … not going to hold my breath though (-8
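
    A quick Python sketch of the throughput floodo1 is describing (our own arithmetic, not the commenter's; the frame rates are just sample targets):

        # Raw pixel throughput needed to drive a 4K panel at various frame rates.
        pixels_per_frame = 3840 * 2160   # ~8.3 million pixels, as noted above
        for fps in (60, 100, 144):
            gpix = pixels_per_frame * fps / 1e9
            print(f"4K at {fps}fps: {gpix:.2f} Gpixels shaded per second")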

    • Sargent Duck
    • 3 years ago

    I’ll be impressed when it’s free.

    Oh, wait a minute…

    • ozzuneoj
    • 3 years ago

    Let me know when they can eliminate sample-and-hold blur artifacts while using all these fancy features on super-expensive gaming monitors. Till then, an old CRT still has at least one major benefit over these, which is kind of lame…

    • psuedonymous
    • 3 years ago

    “As PC World notes, FreeSync 2 is an attempt from AMD at establishing an industry standard”

    And as Anandtech notes (http://www.anandtech.com/show/10967/amd-announces-freesync-2-improving-ease-lowering-latency-of-hdr-gaming), FreeSync 2 is AMD’s move towards a G-Sync-like product line. FreeSync 2 requires certification by AMD and minimum performance levels (something whose absence in the original FreeSync resulted in lots of pretty poor monitors), and AMD has refused to rule out fees for its use. In addition, FreeSync 2 differs from G-Sync in that FreeSync 2 requires games to implement AMD-specific APIs to work with HDR.

      • Klimax
      • 3 years ago

      Re the API requirement: that’s dumb. They don’t have the market share for that. Almost nobody wants to spend more time on a vendor-specific API either.

    • TwoEars
    • 3 years ago

    Give me a 27-inch 1440p version and I’m buying.

      • Rectal Prolapse
      • 3 years ago

      And ULMB!~

      • Voldenuit
      • 3 years ago

      I mean, there are already two 27″ G-sync ROG monitors. Adding HDR10 to a sub-4K display might be a bit of a waste, as much of the draw is presumably being able to display HDR UHD content.

    • AnotherReader
    • 3 years ago

    Impressive! I assume that these would require two DisplayPort 1.3 cables to drive them. At CEATEC 2016, Sharp demonstrated a 27-inch 8K 120Hz IGZO monitor with HDR (http://www.anandtech.com/show/10732/ceatec-2016-sharp-showcases-27-inch-8k-120hz-igzo-monitor-with-hdr-also-1000-ppi-for-vr) and that required eight DisplayPort 1.2 cables to drive it.

      • psuedonymous
      • 3 years ago

      Re “DisplayPort 1.2”: Pascal introduced DP 1.3 outputs, with a future update promised to add DP 1.4 support (adding DSC). That might rule out using a single DP link with Maxwell and earlier, though MST monitors were hardly unheard of when consumer UHD displays first started becoming available.
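
      The bandwidth arithmetic behind this sub-thread is easy to check with a short Python sketch (our own, using the published effective DisplayPort payload rates; blanking overhead is ignored, so real requirements run slightly higher):

        # Effective (post-8b/10b) payload rates for a four-lane DisplayPort link.
        DP12_GBPS = 17.28   # HBR2, DP 1.2
        DP13_GBPS = 25.92   # HBR3, DP 1.3/1.4 (DP 1.4 adds DSC on top)

        def video_gbps(w, h, hz, bits_per_pixel):
            return w * h * hz * bits_per_pixel / 1e9

        need = video_gbps(3840, 2160, 144, 30)   # 10 bits per channel, RGB
        print(f"4K 144Hz 10-bit RGB: ~{need:.1f} Gbit/s vs ~{DP13_GBPS} on DP 1.3")
        # A single uncompressed DP 1.3 link falls short, so chroma subsampling,
        # DSC, or a second cable has to close the gap.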

    • tipoo
    • 3 years ago

    With FreeSync 2 eliminating a lot of the guesswork and raising the bar on what monitor makers can get away with, frig, I really wish Nvidia would support both.

      • odizzido
      • 3 years ago

      They can’t see your wishes because they’re covered in money.

      • nanoflower
      • 3 years ago

      They probably will, but it will take Intel adding support for FreeSync 2 to its products first. Once that happens, it’s just a matter of time before Nvidia caves in and supports FreeSync.

      • EndlessWaves
      • 3 years ago

      With the laudably high standards nVidia is requiring for the G-Sync HDR label, we may end up with the same situation again.

      1,000 nits of brightness and 384 dimming zones is 2,000+ television territory. It’s not full HDR support, but it’s a lot better than most televisions are providing. Although I’m slightly surprised at the small colour gamut; Adobe RGB/DCI-P3 has been available for years and I’d have expected them to push for wider.

      Still, it’s not necessarily a bad thing. A bugger for anyone wanting a mainstream nVidia card for other reasons, but it means nVidia offers a proper premium experience and, for the rest of us, gets the technology into FS2 monitors faster than it would if it were left to monitor manufacturers.

        • XTF
        • 3 years ago

        With HDMI 2.1 supporting VRR, isn’t G-Sync finally dead?

      • Klimax
      • 3 years ago

      Massively doubtful. A lot of FS2 is under AMD’s control, and some things are most likely critical deal-breakers (like AMD’s custom API).

      • CoD511
      • 3 years ago

      Both would be nice… but I suppose it doesn’t benefit Nvidia, and its board of directors wouldn’t like it regardless. Well, they’re at least actually investing, which does justify it somewhat.
