apkellogg
Gerbil Elite
Topic Author
Posts: 962
Joined: Wed Feb 25, 2004 10:15 am

True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 10:09 am

I'm currently looking for a new monitor for my computer so that I can pass off an older one to my son. I would like to get a monitor that will last me a number of years, so I was thinking a 32", 10-bit, HDR, 4K monitor. However, it appears as though almost every monitor is 8-bit + FRC. Would I really notice the difference if I go the 8-bit + FRC route vs. true 10-bit? I have been looking at a couple of Samsung monitors, Link 1 or Link 2.

If it makes a difference, the computer is used for normal day to day office work, plus the occasional game or movie. Thank you for any advice!
 
bthylafh
Maximum Gerbil
Posts: 4320
Joined: Mon Dec 29, 2003 11:55 pm
Location: Southwest Missouri, USA

Re: True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 10:21 am

If you're not doing graphic design work, why do you think you need a 10-bit panel?
Hakkaa päälle!
i7-8700K|Asus Z-370 Pro|32GB DDR4|Asus Radeon RX-580|Samsung 960 EVO 1TB|1988 Model M||Logitech MX 518 & F310|Samsung C24FG70|Dell 2209WA|ATH-M50x
 
apkellogg
Gerbil Elite
Topic Author
Posts: 962
Joined: Wed Feb 25, 2004 10:15 am

Re: True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 10:27 am

bthylafh wrote:
If you're not doing graphic design work, why do you think you need a 10-bit panel?

I guess it's just that I've had my current main monitor for over 10 years, so I want to make sure I get as much of the current tech as possible so that the next one lasts as long.
 
DPete27
Grand Gerbil Poohbah
Posts: 3776
Joined: Wed Jan 26, 2011 12:50 pm
Location: Wisconsin, USA

Re: True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 10:36 am

I doubt you'd notice the difference between 8-bit and 10-bit unless you're an elite photo editor. Even then, the cost premium is unlikely to be justified (given my impression of your usage).

I would steer toward a 2560x1440 resolution if you want to play modern games. That puts you in RX580-Vega56 or GTX1060 6GB/GTX1070 territory for GPUs, instead of GTX1080Ti territory for 4K. A 32" 2560x1440 monitor also has the same pixel density as a 24" 1920x1080 monitor (~91 PPI, which is a comfortable density for typical desktop viewing; anywhere between 91 and 110 PPI works about as well, IMO). 4K would get you higher PPI, but you'd most likely want to use Windows scaling to make text large enough to read, and some software doesn't play nice with Windows DPI scaling.
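To put rough numbers on the pixel-density talk, here's a quick back-of-envelope sketch in Python (purely illustrative):

    from math import hypot

    def ppi(width_px, height_px, diagonal_in):
        # pixels along the diagonal divided by the diagonal length in inches
        return hypot(width_px, height_px) / diagonal_in

    print(round(ppi(2560, 1440, 32), 1))  # 91.8  -> 32" 1440p
    print(round(ppi(1920, 1080, 24), 1))  # 91.8  -> same density as 24" 1080p
    print(round(ppi(3840, 2160, 32), 1))  # 137.7 -> 32" 4K, hence DPI scaling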

On your two monitor choices: I've owned the C27HG70 since December 2017, which is the 27" version of your first link. I got it because I felt it was (and still is) the best gaming monitor out there (with FreeSync). There are other manufacturers releasing monitors this year with the same Samsung panel, but I don't think they have access to QLED. The CHG70 has a 40-144Hz FreeSync refresh range and a little higher peak brightness, along with most of the same features the UH750 has. The UH750 is 4K, but it's limited to a 60Hz refresh rate, so it's more of a "business" class monitor.

Oh, BTW, the C32HG70 that you linked has been as low as $540 on Newegg. I've been price-watching since it came out last....July(?). I got my C27HG70 for $450.

I bought the "best of the best" for the same reasons you're stating here, but like everything else, the landscape of monitors is constantly changing. Before FreeSync/G-Sync came out, we saw more widespread adoption of IPS panels and larger monitors (followed by accompanying higher resolutions). In the future, we're likely to see a convergence of variable refresh standards (I do like that Samsung is releasing firmware updates for my CHG70, so it's possible it will age well in this battle), as well as higher contrast from panel tech like OLED and NanoLED, which will spark more widespread adoption of HDR in games, movies, and consumer software (cough, Windows 10), etc. Today's high-end, HDR-certified monitors are prepared to handle this evolution of content, but they won't be the best monitors forever.
Main: i5-3570K, ASRock Z77 Pro4-M, MSI RX480 8G, 500GB Crucial BX100, 2 TB Samsung EcoGreen F4, 16GB 1600MHz G.Skill @1.25V, EVGA 550-G2, Silverstone PS07B
HTPC: A8-5600K, MSI FM2-A75IA-E53, 4TB Seagate SSHD, 8GB 1866MHz G.Skill, Crosley D-25 Case Mod
 
TheEmrys
Minister of Gerbil Affairs
Posts: 2529
Joined: Wed May 29, 2002 8:22 pm
Location: Northern Colorado
Contact:

Re: True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 12:53 pm

I would only ever consider 10-bit if you were doing professional photo work of some sort and color accuracy was imperative. There just isn't a need for 1 billion colors over 16 million.
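For what it's worth, those two figures fall straight out of the bit depths (three channels each):

    # colors = 2^(bits per channel * 3 channels)
    print(2 ** (8 * 3))   # 16,777,216    -> "16 million" at 8-bit
    print(2 ** (10 * 3))  # 1,073,741,824 -> "1 billion" at 10-bit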
Sony a7II 55/1.8 Minolta 100/2, 17-35D, Tamron 28-75/2.8
 
anotherengineer
Gerbil Jedi
Posts: 1688
Joined: Fri Sep 25, 2009 1:53 pm
Location: Northern, ON Canada, Yes I know, Up in the sticks

Re: True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 7:14 pm

I have been happy with my BenQ BL2420PT for the price. It's a true 8-bit panel with no adaptive sync or 120Hz, but as I said, very happy with it for what it cost.
Life doesn't change after marriage, it changes after children!
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 7:26 pm

Aside: I've been resisting going past 1920x1080 because I've got multiple systems running through a KVM, and I'd need to upgrade everything (including the KVM) to DP. Gets kinda costly. I've addressed the screen real estate issue for my primary rig by running triple-head (with the secondary and tertiary displays not running through the KVM). But I can feel the pull of higher screen resolutions... getting spoiled by the 4K at work!
Nostalgia isn't what it used to be.
 
Ryu Connor
Global Moderator
Posts: 4369
Joined: Thu Dec 27, 2001 7:00 pm
Location: Marietta, GA
Contact:

Re: True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 8:31 pm

Those are both slightly amped-up sRGB panels. It's arguable whether either of them substantially benefits from 10-bit; 10-bit is just going to make them metamer machines.

As I recall, it's still under debate whether the CHG70 is even capable of 10-bit. IIRC, other companies using that panel say it's just an 8-bit panel, as do several spec sheets for the panel in question.
All of my written content here on TR does not represent or reflect the views of my employer or any reasonable human being. All content and actions are my own.
 
apkellogg
Gerbil Elite
Topic Author
Posts: 962
Joined: Wed Feb 25, 2004 10:15 am

Re: True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 9:41 pm

Just wanted to say thank you to everyone for all of the info and insight that has been provided. I guess I'll have to think long and hard about what I really need/want.
 
LostCat
Minister of Gerbil Affairs
Posts: 2107
Joined: Thu Aug 26, 2004 6:18 am
Location: Earth

Re: True 10-bit vs 8-bit + FRC?

Fri Apr 27, 2018 10:27 pm

Ryu Connor wrote:
Those are both slightly amped-up sRGB panels. It's arguable whether either of them substantially benefits from 10-bit; 10-bit is just going to make them metamer machines.

As I recall, it's still under debate whether the CHG70 is even capable of 10-bit. IIRC, other companies using that panel say it's just an 8-bit panel, as do several spec sheets for the panel in question.

8-bit + FRC at minimum, if so, going by it meeting the DisplayHDR 600 spec:
https://displayhdr.org/performance-criteria/
Other companies making similar mons have only met the 400 spec, IIRC.
Meow.
 
EndlessWaves
Gerbil Team Leader
Posts: 262
Joined: Fri Jul 10, 2009 10:41 am

Re: True 10-bit vs 8-bit + FRC?

Sat Apr 28, 2018 5:06 am

And bear in mind that most people's experience of native vs. dithered color would have been on ~96 DPI monitors. If you're going for a HiDPI screen like 3840x2160 at 32", the pixels are even tinier and dithering with them is even less noticeable.
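For anyone curious what the dithering actually does, here's a minimal Python sketch of the temporal half of FRC (real panels also vary the pattern from pixel to pixel, so treat this as the averaging idea only):

    def frc_frames(v10, n_frames=4):
        # v10 is a 10-bit code (0..1023); each 8-bit step spans four
        # 10-bit steps, hence the divmod by 4
        base, frac = divmod(v10, 4)
        # show the next level up on 'frac' of the frames, 'base' on the rest
        return [min(base + 1, 255) if i < frac else base
                for i in range(n_frames)]

    frames = frc_frames(513)           # a 10-bit value between two 8-bit levels
    print(frames)                      # [129, 128, 128, 128]
    print(sum(frames) / len(frames))   # 128.25, i.e. 513 / 4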

As others have said, it's a non-issue unless you need absolute calibrated precision for professional image work.

There are plenty of much bigger potential issues with a screen like that. The ones I'd worry about are oversaturation in most programs (from the wide-gamut backlight) and the inability of a few programs to scale to a HiDPI monitor.
 
jmc2
Gerbil XP
Posts: 364
Joined: Mon Aug 22, 2011 8:30 am

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 9:23 am

The issue here for me would be the ~90 DPI pixel density: I can see a texture on the monitor screen from the large pixels.

I changed to a 4K screen with 150+ DPI and now the screen appears smooth! That's important for me, but you may not notice it. I wish I didn't, because I would love to try that 32-inch CHG70. If it were 4K I would buy it in a heartbeat!

The other thing is whether HDR is, or may become, important to you. I have never seen HDR in person yet, but from what I understand, 10-bit is important to counter color banding.

https://www.tomshardware.com/news/hdr-m ... 36868.html
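To see why the extra bits matter for banding, here's a toy sketch that quantizes the same smooth gradient at 8 and 10 bits (illustrative only; visible HDR banding also depends on the transfer curve and the content):

    N = 100_000
    ramp = [i / (N - 1) for i in range(N)]   # ideal 0..1 gradient

    def levels(bits):
        # distinct output codes after quantizing the ramp
        return len({round(v * (2 ** bits - 1)) for v in ramp})

    print(levels(8))    # 256 steps
    print(levels(10))   # 1024 steps, so each band is a quarter as wide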

It is early days for HDR/HDR10+/Dolby Vision standards and support, and will be for a good while.

jmc2
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 10:11 am

jmc2 wrote:
I changed to a 4K screen with 150+ DPI and now the screen appears smooth! That's important for me, but you may not notice it.

It can be a detriment if some of the software you use (and/or your OS) does not scale well to higher DPIs. Running a full-screen Linux VM on a 28" 4K display had some issues (this was the setup I used at work for about a year). Now that I'm on a 42" 4K it works better. Yeah, text is not as smooth, but the random UI scaling glitches are gone.
Nostalgia isn't what it used to be.
 
LostCat
Minister of Gerbil Affairs
Posts: 2107
Joined: Thu Aug 26, 2004 6:18 am
Location: Earth

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 10:55 am

jmc2 wrote:
I wish I didn't, because I would love to try that 32-inch CHG70. If it were 4K I would buy it in a heartbeat!

It actually does 4K over HDMI...they don't acknowledge the functionality anywhere. I assume there's a reason, but hell if I know what.
Meow.
 
jmc2
Gerbil XP
Posts: 364
Joined: Mon Aug 22, 2011 8:30 am

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 11:34 am

just brew it! wrote:
jmc2 wrote:
I changed to a 4K screen with 150+ DPI and now the screen appears smooth! That's important for me, but you may not notice it.

It can be a detriment if some of the software you use (and/or your OS) does not scale well to higher DPIs. Running a full-screen Linux VM on a 28" 4K display had some issues (this was the setup I used at work for about a year). Now that I'm on a 42" 4K it works better. Yeah, text is not as smooth, but the random UI scaling glitches are gone.


Oh, it's just the actual monitor pixels themselves that I don't want to see. I rarely, if ever, use the 4K resolution; I run 1440p for my desktop.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 11:44 am

LostCat wrote:
jmc2 wrote:
I wish I didn't, because I would love to try that 32-inch CHG70. If it were 4K I would buy it in a heartbeat!

It actually does 4K over HDMI...they don't acknowledge the functionality anywhere. I assume there's a reason, but hell if I know what.

Are you sure that's 4K at a reasonable refresh rate? I've seen setups that'll do 4K over HDMI, but only if you back the refresh down to 30 Hz (which massively sucks).
Nostalgia isn't what it used to be.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 11:50 am

jmc2 wrote:
Oh, it's just the actual monitor pixels themselves that I don't want to see. I rarely, if ever, use the 4K resolution; I run 1440p for my desktop.

OK, now I'm just confused. You shelled out the $ for a 4K monitor, but aren't using it as one? Doesn't using a 4K native display at 1440p result in scaling artifacts?

FWIW I sit far enough from my 42" 4K that I don't see individual pixels. Sitting close enough to see the individual pixels would mean sitting too close to clearly see the entire screen without constantly leaning back and forth, which would be annoying.
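A rough way to put a number on that sweet spot, assuming ~1 arcminute of visual acuity for 20/20 vision (back-of-envelope only):

    from math import hypot, radians, tan

    def pixel_blend_distance_in(w_px, h_px, diag_in):
        # distance at which a single pixel subtends 1 arcminute
        pixel_in = diag_in / hypot(w_px, h_px)   # pixel pitch in inches
        return pixel_in / tan(radians(1 / 60))

    print(round(pixel_blend_distance_in(3840, 2160, 42), 1))  # ~32.8"

So beyond roughly 33 inches, individual pixels on a 42" 4K should blur together.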
Nostalgia isn't what it used to be.
 
Usacomp2k3
Gerbil God
Posts: 23043
Joined: Thu Apr 01, 2004 4:53 pm
Location: Orlando, FL
Contact:

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 12:18 pm

just brew it! wrote:
OK, now I'm just confused. You shelled out the $ for a 4K monitor, but aren't using it as one? Doesn't using a 4K native display at 1440p result in scaling artifacts?

My third screen here at work is actually a 28" 2160p screen running at 1440p. It is definitely not nearly as sharp as the two native 1440p 27" screens next to it, but it's usable. 2160p text was too small at my ~36" viewing distance, and the PPI mismatch between it and the other screens was giving me headaches. Running it as a tertiary screen at a non-native resolution works great. I keep most of my detail work on the other two screens, so it's more for reference material or our RDP client for the MRP system.
 
jmc2
Gerbil XP
Posts: 364
Joined: Mon Aug 22, 2011 8:30 am

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 3:18 pm

just brew it! wrote:
jmc2 wrote:
Oh, it's just the actual monitor pixels themselves that I don't want to see. I rarely, if ever, use the 4K resolution; I run 1440p for my desktop.

OK, now I'm just confused. You shelled out the $ for a 4K monitor, but aren't using it as one? Doesn't using a 4K native display at 1440p result in scaling artifacts?

FWIW I sit far enough from my 42" 4K that I don't see individual pixels. Sitting close enough to see the individual pixels would mean sitting too close to clearly see the entire screen without constantly leaning back and forth, which would be annoying.


I sit about 20 inches away from my old 27-28 inch Samsung 4K, and I tend to primarily use the left half of the monitor. It's a too-big/too-close problem, but I've got to be able to read. I don't notice any artifacts, but then I've had my lenses replaced and still wear glasses. (This monitor also has the "not waking up/must power off" Samsung problem.)

If I'm playing an active game I'm forced to push the monitor a good bit farther away or I will get sick. But then I'm not trying to read text.

I have wondered whether something like your 42" 4K would be usable/readable without having to turn my head to use the whole screen. The size/distance/readability issue would be a very personal balancing act on exactly what to get.
 
Waco
Maximum Gerbil
Posts: 4850
Joined: Tue Jan 20, 2009 4:14 pm
Location: Los Alamos, NM

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 4:18 pm

just brew it! wrote:
LostCat wrote:
jmc2 wrote:
I wish I didn't, because I would love to try that 32-inch CHG70. If it were 4K I would buy it in a heartbeat!

It actually does 4K over HDMI...they don't acknowledge the functionality anywhere. I assume there's a reason, but hell if I know what.

Are you sure that's 4K at a reasonable refresh rate? I've seen setups that'll do 4K over HDMI, but only if you back the refresh down to 30 Hz (which massively sucks).

You can run 4K60 over HDMI with chroma subsampling - it's generally not noticeable at all (especially in games).
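The bandwidth math is why that works; a rough sketch of the raw pixel data rate (ignoring blanking intervals and link encoding overhead, so real cable budgets are tighter):

    # average samples per pixel: 4:4:4 keeps all three, 4:2:0 keeps one
    # luma sample plus a quarter of each chroma channel
    SAMPLES = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

    def data_rate_gbps(w, h, hz, bits=8, chroma="4:4:4"):
        return w * h * hz * SAMPLES[chroma] * bits / 1e9

    print(round(data_rate_gbps(3840, 2160, 60), 1))                  # ~11.9
    print(round(data_rate_gbps(3840, 2160, 60, chroma="4:2:0"), 1))  # ~6.0

Halving the payload is what lets 4K60 squeeze through links that can't carry full 4:4:4.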
Victory requires no explanation. Defeat allows none.
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 4:25 pm

jmc2 wrote:
I have wondered whether something like your 42" 4K would be usable/readable without having to turn my head to use the whole screen. The size/distance/readability issue would be a very personal balancing act on exactly what to get.

Yeah, it is definitely gonna depend a lot on your eyesight. I wear glasses, but lack of accommodation (presbyopia) means there's a pretty narrow sweet spot (distance-wise) where I can read both the center and edges of the display clearly. Progressive lenses (which are great for most other everyday tasks) are problematic in this situation because I get a stiff neck from having to sit with my head tilted back for extended periods.
Nostalgia isn't what it used to be.
 
LostCat
Minister of Gerbil Affairs
Posts: 2107
Joined: Thu Aug 26, 2004 6:18 am
Location: Earth

Re: True 10-bit vs 8-bit + FRC?

Mon Apr 30, 2018 10:19 pm

just brew it! wrote:
Are you sure that's 4K at a reasonable refresh rate? I've seen setups that'll do 4K over HDMI, but only if you back the refresh down to 30 Hz (which massively sucks).

It's 4K60. I assume most HDMI 2.0 mons are.
Meow.
 
Igor_Kavinski
Minister of Gerbil Affairs
Posts: 2077
Joined: Fri Dec 22, 2006 2:34 am

Re: True 10-bit vs 8-bit + FRC?

Tue May 01, 2018 4:19 am

https://www.amazon.com/LG-Electronics-O ... B073K3LPGF

I know this might be way over your budget but here are the pros:

    Huge screen
    True blacks and insane contrast, with a 10-bit wide color gamut approaching 100% of the DCI-P3 colorspace
    Relatively futureproof
    Blue light protection mode
    ABL (automatic brightness limiting) will usually ensure that you are not blinded by sudden full-screen brightness

Cons:
    Pricey but not out of reach considering the benefits
    Image retention and permanent burn-in risk but that can be mitigated by setting your screensaver to kick in every 60 seconds
 
just brew it!
Administrator
Posts: 54500
Joined: Tue Aug 20, 2002 10:51 pm
Location: Somewhere, having a beer

Re: True 10-bit vs 8-bit + FRC?

Tue May 01, 2018 6:05 am

LostCat wrote:
just brew it! wrote:
Are you sure that's 4K at a reasonable refresh rate? I've seen setups that'll do 4K over HDMI, but only if you back the refresh down to 30 Hz (which massively sucks).

It's 4K60. I assume most HDMI 2.0 mons are.

Depends on the GPU end as well. My work MBP is limited to 4K30 over HDMI; you have to use DP to get 4K60.
Nostalgia isn't what it used to be.
