Samsung announces the first FreeSync 2 displays

The ranks of gaming monitors with quantum-dot-enhanced LED backlighting have grown a bit today with Samsung's CHG70 and CHG90 displays. The new monitors are the first FreeSync 2-compliant displays, meaning that they support HDR, wide color gamuts, and the ability to communicate those characteristics to the host display adapter. All of these screens use VA panels with 1800R curvatures.

Samsung will offer up CHG70 displays in 27" and 31.5" sizes, both sporting 2560×1440 resolutions. The big-daddy 49" CHG90 has a 3840×1080 resolution and a 32:9 aspect ratio, similar to a pair of 1920×1080 monitors sitting next to each other, minus the pesky bezels in the middle. All three models have the same 144 Hz refresh rate. Both CHG70 models have one DisplayPort and a pair of HDMI inputs, while the CHG90 adds a mini-DisplayPort to that mix.

Response time is touted at 1 ms (MPRT), with 178° viewing angles, 600 cd/m² brightness, and support for over a billion colors all around. Contrast ratios were not provided. All three models include built-in USB 3.0 hubs, audio jacks, and height-adjustable stands. Additional adjustments include tilt and swivel, and the smaller, lighter CHG70 models can also pivot into portrait mode. As for that HDR support, Samsung claims 125% coverage of the sRGB color space and 95% of DCI-P3.

FreeSync 2 requires support for HDR, wide color gamuts, and the Low Framerate Compensation technology that was left optional in the original FreeSync specification. FreeSync 2 certification is a much higher bar to clear than the original version, so FreeSync and FreeSync 2 displays will coexist for the foreseeable future. The required support for Low Framerate Compensation alone makes FreeSync 2 certification a noteworthy feature for owners of compatible AMD graphics cards.
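
For readers unfamiliar with the feature, LFC keeps a panel inside its variable-refresh window by repeating frames whenever a game's frame rate falls below the display's minimum refresh rate. Here is a minimal sketch of the idea in Python, assuming a hypothetical 48-144 Hz window; it illustrates the concept, not AMD's actual driver logic:

```python
def lfc_refresh(fps, vrr_min=48, vrr_max=144):
    """Return (frame multiplier, effective panel refresh) for a frame rate.

    The 48-144 Hz window is hypothetical; real displays report their own
    supported ranges to the graphics driver.
    """
    if fps >= vrr_min:
        return 1, fps  # inside the window: no compensation needed
    multiplier = 1
    # Repeat each frame until the effective rate re-enters the window.
    while fps * multiplier < vrr_min and fps * (multiplier + 1) <= vrr_max:
        multiplier += 1
    return multiplier, fps * multiplier

for fps in (30, 40, 20):
    mult, hz = lfc_refresh(fps)
    print(f"{fps} fps -> each frame shown {mult}x, panel refreshes at {hz} Hz")
```

A 30-fps game on such a display would have each frame shown twice, so the panel keeps refreshing at 60 Hz, comfortably inside its variable-refresh range.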

Samsung did not provide pricing or availability information for the CHG70 and CHG90 displays, though the HDR support, quantum LED panel, and high refresh rates all suggest premium pricing.

Comments closed
    • JustAnEngineer
    • 3 years ago

    Here’s a quick review of the C32HG70:
    https://www.gamecrate.com/samsung-chg70-32-inch-hdr-gaming-monitor-overview/16348

    • alrey
    • 3 years ago

    Screens were actually curved back in the CRT days, but in the opposite direction.

    • Starfalcon
    • 3 years ago

    The 31.5" version is available exclusively at Newegg for $699, and the 27" is available only through the Samsung website for $599.

    • RdVi
    • 3 years ago

    "All of these screens use VA panels" Yay! "...with 1800R curvatures." Boo! I'm ready to buy a ~27" 2560x1440 flat VA monitor with FS2 and >100hz refresh. Someone just needs to bring one out.

      • DPete27
      • 3 years ago

      I completely agree. That 27″ model looks like the silver bullet to me... except for the curved screen.

      • BobbinThreadbare
      • 3 years ago

      I think your main problem is the lack of FreeSync 2 screens.

      With FreeSync 1, I believe this Asus fits the bill: https://www.newegg.com/Product/Product.aspx?Item=N82E16824236466

    • ozzuneoj
    • 3 years ago

    So do they measure response time differently than most other manufacturers? This is the first time I’ve seen a non-TN LCD screen advertised as having a 1ms response time. This seems like a pretty big deal to me.

    It's a shame these displays don't have some kind of blur-reduction scanning-backlight mode though, as even with an instantaneous response time there will still be sample-and-hold image-persistence artifacts. At least with Gsync monitors you get ULMB, and some hacks have shown that it actually works well with Gsync on certain displays (despite the common belief that it shouldn't).

    People who haven’t experienced ULMB, Lightboost, BenQ Blur Reduction or other similar modes (or used a CRT recently) don’t know what they’re missing. I would have a hard time justifying the purchase of an expensive monitor that cannot provide a clear image during motion. At 144Hz and a solid 144 fps there is a slight improvement in motion clarity over a lower refresh rate monitor, but blur reduction modes provide a significantly clearer image regardless of frame rate.

    https://www.blurbusters.com/faq/60vs120vslb/

      • RAGEPRO
      • 3 years ago

      Heh, it’s funny you say that.

      Samsung reports MPRT, which is “moving picture response time”. Those figures come from having the impulsive scanning strobe backlight turned on. You won’t see a 1ms response time without MPRT enabled; more like 4ms.

        • stefem
        • 3 years ago

        Funny 🙂 but backlight strobing does not affect pixel response time; it just tricks your visual system.

          • RAGEPRO
          • 3 years ago

          In the end it has the same effect. You don’t see the pixel transitions, which is the point.

      • Pettytheft
      • 3 years ago

      In every monitor post, there is someone who trots out every monitor buzzword and insists that people don't know what they are missing. I think it's some sort of personal justification for the extra hundreds of dollars dropped on hardware features that most people are not sensitive to or don't care about. Yes, I've seen all the fancy features in action, and I could tell the difference between 60 and 90fps, but it's diminishing returns compared to 30-60. Gsync/Freesync does nothing for me. I always make sure settings are able to handle at least a solid 60fps.

      It wasn't long ago that gamers were happy with an IPS panel that had a decent response time. You know what I want? A nice glossy panel. Gimme that shine and colors that pop. I still have my Crossover monitor, and its colors are still noticeably better than my current 144hz, matte, IPS with a fancy game mode to reduce blur. I sit in a room with blinds to block the glare, and most of my gaming is at night. I feel like we are being sold on all these new features just to continue the purchase cycle. Monitors and faster SSDs are the biggest hype right now.

        • travbrad
        • 3 years ago

        I agree with you about Gsync/Freesync. They make terrible framerates look slightly less terrible, but you are way better off just getting more FPS in the first place. I'm not really impressed with ULMB either. I really just can't see the difference, or if I can, it's borderline placebo.

        On high refresh rates I slightly disagree, though. There are definitely diminishing returns, but for me those start more around 100hz/FPS. The difference between 60 and 90 is still massive. For me it's one of those things I notice more going back to 60FPS after experiencing the silky smoothness of high refresh rates. There are a few games effectively locked at 60FPS, and they are just noticeably unsmooth now. Nowhere near "unplayable," but still not ideal.

        I got a monitor with Gsync/ULMB out of curiosity since I was already spending hundreds on a 27″ 144hz monitor anyway, but in retrospect Gsync was a waste of money. After experiencing it firsthand, I just can't understand the level of hype adaptive refresh got. I understand why Nvidia and monitor makers hyped it up, but nearly every "independent" tech journalist/reviewer hyped it up too for seemingly no reason. It's a feature marketed at high-end gamers, but it mostly benefits slow PCs, and even then the improvement is limited.

          • BobbinThreadbare
          • 3 years ago

          The *Sync technologies mean no added lag and no screen tearing. This is useful beyond what you can consciously notice, and frankly it's often cheaper than more FPS, and certainly more future-proof than spending more money on a GPU.

        • ozzuneoj
        • 3 years ago

        If it were a justification for buying an expensive monitor on my part, I wouldn't still be clinging to my CRTs and recommending that people not throw theirs away. My non-Gsync BenQ XL2720Z is the only monitor I have used that I prefer over my old HP P1230 CRT, and I admit that there are still trade-offs (color quality, pixel density, resolution flexibility... all better on the CRT... the LCD is larger, lighter, and sharper, though).

        There are thousands of gamers out there who were born or started gaming during the LCD era and have never even seen the difference in motion clarity on a CRT, so they will have little reason to splurge on a modern monitor that tries to recreate that clarity. With other buzzword features being more prominent (and incompatible with ULMB, blur reduction, etc.), even those who buy gaming monitors generally don't get to see the difference.

        You don't even have to spend tons of money to get these features. My monitor was just over $200 for a refurb last year. A 24″ monitor with BenQ Blur Reduction will give similar (though not identical) performance for even less money in many cases.

        I personally am not all that interested in Gsync. I prefer to simply play games at higher frame rates, at which point variable refresh rate tech no longer makes much of a difference.

        As for gamers being happy with IPS panels with decent response times... there was a time when gamers/computer users were happy with low-resolution CRT monitors and televisions, and the instantaneous input and response times coupled with perfect motion clarity were taken for granted and never advertised. Then the computer industry decided to start pushing cheaper-to-manufacture LCDs and made people desire slimmer, lighter, more high-tech-looking "flat-screens" that were "HIGH DEF" and "WIDE SCREEN," and the buzzwords started to take over. IPS has turned into a buzzword, as have OLED, 4K, HDR, etc., and regardless of how many of these features they check off the list, an incredibly small percentage of modern displays can match an ancient CRT in the areas that were just taken for granted back in the day.

        I agree with you that buzzwords are stuffed down the throats of consumers, as they always have been. But when people blow scads of cash on computers and gaming consoles and their entire lives revolve around these kinds of things, it's amazing how few realize that all the super-uber-8K-HDR graphics in the world aren't going to make any difference if there are 8 to 16 pixels of image persistence when the image moves, on top of the poor response time and possible input lag. If you genuinely want to see what I mean, just try using a CRT once for modern gaming. It will be smaller and the image will be somewhat softer, but look at the clarity when you move. Human eyes do some funny things that make us see blurring. CRTs and some specific LCD/OLED screens can trick the eye into not doing these funny things. That's what it comes down to.

          • travbrad
          • 3 years ago

          "There are thousands of gamers out there that were born or started gaming during the LCD era and have never even seen the difference in motion clarity on a CRT, so they will have little reason to splurge on a modern monitor that tries to recreate that clarity. With other buzz-word features being more prominent (and not able to be used with ULMB, blur reduction etc.) even those that buy gaming monitors generally don't get to see the difference."

          I started gaming on CRT monitors with Wolfenstein 3D and Doom, then eventually Quake/II/III, Half-Life/CS, etc. ULMB still does nothing for me compared to just regular/Gsynced 120/144hz. I realize there is a demonstrable difference when you take still photographs of it, but I guess I'm just incapable of seeing it in real time. It sounds great in theory, which is why I bought a monitor with ULMB/Gsync in the first place, but no amount of theory will make my eyes/brain capable of seeing it.

          For what it's worth, every 60hz LCD I've seen has noticeable motion blur, ranging from horrible to "okay" on displays with good overdrive. Even with the earlier/60hz displays, the difference in input lag was always far more noticeable than the motion blur to me. My 144hz monitor is the first LCD I've had where there is so little input lag that I truly can't feel the difference anymore. All my past (and current secondary) 60hz monitors were noticeably worse, and my big LCD TVs are even worse than those. Even on the 144hz monitor, if I hit 144FPS and turn on Gsync/Vsync the input lag becomes noticeable again, so above 144FPS I just have Gsync disable itself.

          Apparently people are different. You're sensitive to motion blur. I'm sensitive to input lag. Some people are really sensitive to tearing. Some are sensitive to color shifts, backlight bleed/IPS glow, or some combination of all of those things. It's not just because we haven't experienced the glory of CRTs.

        • Kretschmer
        • 3 years ago

        ULMB is huge. Refresh rates at 100Hz and above are huge. *Sync is nice when you can’t cap framerate.

    • Billstevens
    • 3 years ago

    I can't say I am a fan of those resolutions. If I were going to pay the money for the big daddy, I wouldn't want the equivalent of dual 1080p widescreens. I would pony up for a 5120×1440 version. Seems slightly odd they would forgo a model like that.

    I feel like the bump to 1440 vertical pixels as a baseline for PC really shows off the value of high-end gaming power over, say, traditional PS4 gaming.

      • Voldenuit
      • 3 years ago

      Yeah, I'd prefer a 21:9 3440×1440 similar to the upcoming Asus PG358VQ or Acer Z35P over the 32:9 Sammy.

      • DPete27
      • 3 years ago

      According to my calculation, it’s the equivalent of two 27″ 1920×1080 screens side by side.
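
      A quick back-of-the-envelope check of that claim, for anyone curious. This Python sketch uses only the diagonal and aspect ratio, not official panel dimensions:

      ```python
      import math

      def panel_dims(diagonal_in, w_ratio, h_ratio):
          """Width and height in inches for a given diagonal and aspect ratio."""
          d = math.hypot(w_ratio, h_ratio)
          return diagonal_in * w_ratio / d, diagonal_in * h_ratio / d

      w, h = panel_dims(27, 16, 9)  # one 27" 16:9 panel
      print(f'two side by side: {math.hypot(2 * w, h):.1f}" diagonal')
      ```

      Two 27" 16:9 panels side by side work out to a 48.9" 32:9 diagonal, which rounds to the CHG90's 49".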

        • Bauxite
        • 3 years ago

        Or the top/bottom half of a 55″ 4k, for $1500.

        You can get a pretty damn good 55″ 4k for $3000, and I’d argue that a pure monitor board might actually be a simpler design (accept input as-is and don’t mess with it vs all the junk TVs do).

    • sparkman
    • 3 years ago

    Obligatory when-is-nVidia-gonna-support-Freesync comment?

    Seriously, I want them to.

    • CScottG
    • 3 years ago

    144Hz on a 49″ panel – WOW!

      • DancinJack
      • 3 years ago

      ?

        • Firestarter
        • 3 years ago

        What other monitor of that size supports 144hz input? I think it's pretty special.

          • Demetri
          • 3 years ago

          1080 vertical res though, so it’s basically like having two 1080P panels next to each other, just in 1 monitor.

            • Firestarter
            • 3 years ago

            Which is still pretty cool. I wonder if it'll support presenting itself as 2 monitors; that might make sense when games don't support the wonky screen resolution.

            • CScottG
            • 3 years ago

            True, but it's like having two *LARGE* 1080P panels side-by-side, each doing 144Hz, without the two bezels in the middle.

            ..and I’m pretty sure I haven’t seen that before.

            ..though someone is always going to whine about the dot pitch at typical monitor distances, despite the fact that the size offers a level of immersion that lets you forget about that completely in virtually all games. Hell, I remember CRT monitors with glowing RGB stripes, which didn't bother me at all after about a minute of gameplay.

    • EndlessWaves
    • 3 years ago

    That CHG90 looks like a nice panel. I guess it doesn't have actual high dynamic range, as there's no mention of local dimming, but if FreeSync 2 means wide colour gamuts are actually usable without distorting colours in most programs, it could be worth having.

    It's a shame that it's not HiDPI. I'm guessing the LFC requirement is the limiting factor for 7680×2160 over DP 1.4, as I think the lowest range we've seen is 30-75hz and not 25-60hz.

      • JustAnEngineer
      • 3 years ago

      Won’t it be helpful when monitor manufacturers start supporting FreeSync down to 23.976 Hz and up to 60+ Hz with LFC as a standard offering? Although, given LFC’s frame-doubling prowess, a low-end frequency of 47.95 Hz would be fine as long as the upper limit was high enough to be useful for LFC.
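
      The frame-doubling arithmetic behind that 47.95 Hz figure, sketched in Python (the floor is the hypothetical one proposed above, and the cadence list is just the common film and video rates):

      ```python
      # Repeat each cadence until it lands at or above a 47.95 Hz VRR floor.
      for fps in (23.976, 24.0, 25.0, 29.97, 30.0):
          n = 1
          while fps * n < 47.95:
              n += 1
          print(f"{fps} fps x{n} -> {fps * n:.3f} Hz")
      ```

      Every common cadence clears the floor after a single doubling (23.976 fps lands at 47.952 Hz), which is why a ~48 Hz minimum with LFC would cover film content.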

      • cynan
      • 3 years ago

      What is the benefit of HiDPI for most applications again? Sure, if ultra-high-resolution panels cost the same or only a little more, then by all means. But otherwise, aren't there better things for most enthusiasts to spend their PC dollars on?

      Personally, I think somewhere around 110 DPI is HiDPI enough (i.e., a bit higher pixel density than a 27″ 2560×1440, or about the same as a 3840×1600 37.5″ ultrawide).
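
      For reference, the diagonal-PPI arithmetic behind those figures (a quick sketch; manufacturer specs may round differently):

      ```python
      import math

      def ppi(h_px, v_px, diagonal_in):
          """Pixels per inch along the panel diagonal."""
          return math.hypot(h_px, v_px) / diagonal_in

      print(f'27" 2560x1440:   {ppi(2560, 1440, 27):.0f} PPI')    # ~109
      print(f'37.5" 3840x1600: {ppi(3840, 1600, 37.5):.0f} PPI')  # ~111
      print(f'49" 3840x1080:   {ppi(3840, 1080, 49):.0f} PPI')    # CHG90, ~81
      ```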

    • Demetri
    • 3 years ago

    Leaning toward this over the Nixeus EDG simply for the Freesync2 support. I’m pretty sure I won’t like the curve, but I’ll at least try it out to get a taste of some HDR.

      • kloreep
      • 3 years ago

      Same boat, I like the sound of everything about the 2560×1440 models except for the darn curve.

        • SoM
        • 3 years ago

        soak it in water and press it flat overnight.

        make sure to unplug first :p

      • Helmore
      • 3 years ago

      I don’t actually get why they’re releasing so many curved monitors. Are they really selling that well?

        • Voldenuit
        • 3 years ago

        Could be the ‘3DTV’ craze all over again.

        Curved monitors command a price premium. Monitor makers fall over each other trying to grab a piece of the pie.

        End result, we end up with a glut of curved monitors even if it’s not what consumers actually want.

          • Sargent Duck
          • 3 years ago

          I do see some of the logic behind curved monitors: at a certain point a monitor gets too wide, so by curving it you can get more monitor into your peripheral vision.

          I only have a 24″ so it doesn't affect me much, but I wouldn't mind whether my monitor was curved or not, just as long as it ticked all the other boxes.

            • Bauxite
            • 3 years ago

            Curves actually make *way* more sense for monitors compared to TVs: a single user who will almost always be centered and within a relatively limited range of desktop viewing distances, versus multiple viewers often at oblique angles and a broad spread of distances. With the former, you can take advantage of all the benefits of sitting in the sweet spot: more direct viewing angles and a similar distance from every pixel to your eyes. For PC use there are really no downsides other than form factor/footprint and mounting, and curved screens can actually be better than flat for some setups, like corner desks. There is no distortion or geometry problem with a sane pixel array either. (Hell, you couldn't avoid geometry problems on a CRT no matter what.)

            • SoM
            • 3 years ago

            I believe the main question was how game support was. How is it now, anyway?

            I would've bought a curved one a few months back, but reading about gaming on curved screens turned me away.

            I like curvy things

            • JustAnEngineer
            • 3 years ago

            How would your game know how to correct for field curvature differently for a flat monitor vs. a curved one?

            • Bauxite
            • 3 years ago

            You don't; there really isn't distortion with these very large radii. CRTs had different problems that amplified anything that wasn't perfect, and perfection was impossible with those designs; even a "flat" CRT was way off.

          • Shobai
          • 3 years ago

          If this was a news article about the latest mobile phone with a non-removable battery, people would be falling over themselves to tell you you’re wrong: the manufacturers only make what people want to buy…

      • NoOne ButMe
      • 3 years ago

      I would stay away from LED-backlit LCDs for HDR until they're around 1K nits. OLED around 500 nits is a good starting point also.

      The waiting game continues. I have about $1K set aside for a 75Hz+ FreeSync monitor meeting either of those brightness standards.

      It'll be closer to $2K by the end of the year. Hopefully by that point...

      • Kretschmer
      • 3 years ago

      Even at 34″ ultrawide, I prefer no curve. Sadly, I am not a monitor maker.

    • Bauxite
    • 3 years ago

    Gimme. Their TV VA panels are awesome; I'm already using 48" and 65" sets as main displays (if only the input handling were better...). I considered waiting for the 4k Asus and others coming later, but those are going to be around twice the price, and even the next GPUs won't be enough to push >100hz 4k for a lot of stuff.

      • Voldenuit
      • 3 years ago

      "...even the next GPUs won't be enough to push >100hz 4k for a lot of stuff."

      I'm hoping that sparse/checkerboard rendering will become the norm for high-resolution game rendering, because brute-force solutions are wasteful and inefficient. The Forza team can get 4Kp60 out of an RX 580-class GPU; it's a win-win for everyone.

        • Bauxite
        • 3 years ago

        4k is just plain big, and the only "real" cheat that will work as we keep going up is foveated rendering. It needs new display tech to both track your focus and adjust the pixels or whatever extremely fast, at least an order of magnitude above today's stuff. Once they can do that, I expect it to be used for VR anyway.

        4k raw dumb pixel path:
        9x 720p
        4x 1080p
        2.25x 1440p

        In simple terms, that means if a GPU can do a consistent 4k@60hz (which the Titan Xp kinda can), then it is also good enough for 1440p@135hz. You might need to bump the CPU as well, but the reality of more punishing scaling at higher resolutions also means it should be a smooth 144hz. I expect the next gen to be even more consistent (and cheaper in $/perf) at 4k@60hz, but not a giant leap to 120 or above. Also, SLI/Xfire are generally a giant crapshoot on so many levels; no thanks.
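
        That scaling is easy to sanity-check: hold the pixel-per-second budget of 4k@60hz constant and see what refresh rate the same fill rate buys at each resolution. A rough Python sketch that ignores CPU limits and per-frame overhead:

        ```python
        # Pixel-throughput budget of 4K@60, spent at lower resolutions.
        RES = {"720p": (1280, 720), "1080p": (1920, 1080),
               "1440p": (2560, 1440), "4K": (3840, 2160)}

        budget = RES["4K"][0] * RES["4K"][1] * 60  # pixels per second at 4K@60
        for name, (w, h) in RES.items():
            print(f"{name}: {budget / (w * h):.0f} Hz on the same pixel budget")
        ```

        The results match the multipliers above: 540 Hz at 720p (9 x 60), 240 Hz at 1080p (4 x 60), and 135 Hz at 1440p (2.25 x 60).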
