whm1974 wrote:
Pagey wrote:
I just want to reiterate (and maybe help bring it back on topic?): the requirements to play UHD BD discs on a PC are very exacting, and I don't see sales (at least initial sales) of the hardware/software components necessary being strong at all. Now, if after some time I (and/or other users) happen to end up with a PC that just happens to have the DRM/chipset components necessary to play a UHD BD disc, then I
might bite. But given the option of 1.) buying a 4K TV which supports WCG and HDR plus a stand alone UHD BD player or 2.) buying a PC with the necessary DRM/chipset components, a UHD BD-ROM, and an HDR/WCG capable monitor...I'll go with option 1 at this moment in time
and for the foreseeable future.
I do believe, based on reading AVS Forum postings, there will be a handful of dedicated, hardcore AV enthusiasts that will jump on this first thing, as nearly every consumer tech has a few early adopters that pay the price and pave the way for the rest of us. God bless those folks and their wallets.
OK, from the stuff I've been reading about 4K TVs and computer displays over the past few years, I'm under the impression that most of them
do not have WCG or HDR. Is this true? Not that I'm in the market anytime soon for anything 4K related.
Yeah, this stuff (terminology) is all over the map. For example, Samsung's 2016 lineup has an "SUHD" series that does support HDR10 and features a 10-bit panel for WCG support. They have a mid-tier line, the "KU7000/7500" series, that features "Active Crystal Color" display technology and
can accept an HDR signal. However, testing by Rtings.com states that those panels are 8-bit, not 10-bit. And Samsung doesn't clearly define what "Active Crystal Color" really is (I'm 99% sure it is NOT the Quantum Dot tech featured in the top-tier "SUHD" lineup). So even though the panel can accept an HDR/WCG-enabled signal from the source (assume a set-top UHD BD player), you are likely not getting true HDR and/or WCG.
Sony has an entry-level X800D series that features an 8-bit + FRC panel (VA in the 43", IPS in the 49") that simulates 10-bit via temporal dithering. The panel is their "Triluminos" display, meaning it accepts and displays both HDR and WCG (again, with WCG simulated via 8-bit + FRC).
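If you're curious how that 8-bit + FRC trick works, here's a toy sketch of the idea (just an illustration of temporal dithering in general, not how any actual panel's firmware does it): alternate between the two nearest 8-bit levels across frames so the time-averaged brightness lands on the 10-bit value in between.

```python
def frc_frames(level_10bit, cycle=4):
    """Toy FRC: approximate a 10-bit level (0-1023) on an 8-bit panel.

    Returns the 8-bit level shown on each of `cycle` consecutive frames.
    Averaged over the cycle, the frames reproduce the 10-bit value.
    """
    base, frac = divmod(level_10bit, 4)  # 10-bit value = 8-bit level * 4 + remainder
    # Show the next-higher 8-bit level on `frac` of every 4 frames,
    # and the base level on the rest (clamped at the panel max of 255).
    return [min(base + 1, 255) if i < frac else base for i in range(cycle)]

# 10-bit level 513 sits a quarter-step above 8-bit level 128:
print(frc_frames(513))  # -> [129, 128, 128, 128], averaging 128.25
```

Your eye integrates the flicker, so you perceive roughly 4x as many brightness steps as the panel can natively show; whether that looks as good as a true 10-bit panel is exactly what reviewers like Rtings argue about.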
To further complicate matters, some manufacturers only enable 4K HDR/WCG signals on one or two of the HDMI ports (on entry-level/mid-tier sets only). And, even more vexing, depending on the TV's firmware/OS, you may have to enable "Enhanced" mode on that specific HDMI source (or whatever other term the manufacturer chooses to employ). It's a bit of the wild wild west out there right now in consumer electronics land.
The good news is that the stuff I saw coming out of CES 2017 is finally starting to standardize on terminology and supported formats (e.g., HDR10 vs. Dolby Vision, static vs. dynamic HDR metadata, etc.). If I were buying a 2016 TV, it would be the Sony X800D series, the Samsung SUHD series, or the LG Super UHD series (starts at model 7700 and goes up). Otherwise, wait until spring when the 2017 sets appear.
Oh, and both Samsung and LG had PC monitors with Quantum Dot tech on display at CES, so hopefully we'll start seeing more and more monitors that offer this support.