kamikaziechameleon wrote:Hey guys!
I wanted to have a thread going about how 4K is going to develop and be implemented.
In the interim we are sorta fumbling; I mean, right now consumer TVs are in the lead. There are 4K 240 mhz TVs out there that put to shame all the PC monitors striving for that.
But we don't even have 4K content yet; heck, do we even have a cable or connector standard that supports 4K at 240Mhz? Currently PCs can't even put out 4K, period, without a multi-connector solution that has all kinds of problems.
Discuss.
danny e. wrote:It's Hz, not MHz.
Many of the TV manufacturers' refresh numbers are not really accurate.
cphite wrote:kamikaziechameleon wrote:Hey guys!
I wanted to have a thread going about how 4K is going to develop and be implemented.
In the interim we are sorta fumbling; I mean, right now consumer TVs are in the lead. There are 4K 240 mhz TVs out there that put to shame all the PC monitors striving for that.
But we don't even have 4K content yet; heck, do we even have a cable or connector standard that supports 4K at 240Mhz? Currently PCs can't even put out 4K, period, without a multi-connector solution that has all kinds of problems.
Discuss.
Honestly I think 4K is overkill for most home television use, and will be for some time.
Unless you're watching on a very large screen and/or are sitting abnormally close to the screen, you aren't going to see a difference. Seriously - at around 10 feet away, with a screen of say 50 inches or so, your eye literally cannot distinguish between 1080p and 4K. If you think yours can, you're either kidding yourself or you're from Krypton.
Also, if you're thinking of streaming... the file sizes are two to four times those of 1080p, so if you think the compression artifacts are bad on Netflix HD now, just wait till you see Netflix 4K.
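As a rough sanity check on cphite's viewing-distance claim, here is a minimal Python sketch that estimates how large one pixel appears from 10 feet on a 50-inch 16:9 set and compares it with the roughly one arcminute of detail that 20/20 vision can resolve. The acuity figure and the geometry are my own assumptions, not anything stated in the thread.

```python
import math

# Angular size of one pixel on a 16:9 panel, assuming ~1 arcminute of acuity
# for 20/20 vision (an approximation, not a vision-science model).

def pixel_arcminutes(diagonal_in, horizontal_px, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # screen width from the diagonal
    pixel_pitch = width_in / horizontal_px           # width of one pixel, in inches
    radians = math.atan2(pixel_pitch, distance_in)   # angle one pixel subtends at the eye
    return math.degrees(radians) * 60                # convert degrees to arcminutes

for name, px in [("1080p", 1920), ("4K", 3840)]:
    print(name, round(pixel_arcminutes(50, px, 120), 2), "arcmin per pixel")

# 1080p comes out to ~0.65 arcmin and 4K to ~0.33 arcmin -- both are already below
# the ~1 arcmin limit, so the extra pixels are not resolvable at this distance.
```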
Bauxite wrote:Those "240Hz" TVs are taking a 24/30/60 signal and just multiplying it, or interpolating it in some cases.
Quite a few "120Hz" TVs do the same thing instead of taking an actual 120Hz input.
If you want a good video input standard that has been the first to support higher resolutions at faster refresh rates and has no device royalty or other dumb crap, I'll give you a hint: it's not HDMI.
BTW, most 4K sets for sale today either have inputs capped at 30Hz or use terrible hacks such as logically addressing the panel as two displays divided down the middle.
It seems like there is a delay in getting a proper 4K panel controller chip out to the OEMs, so for now they just want to check the 4K feature box so they can sell it to people who don't know any better.
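For what it's worth, the "two displays divided down the middle" trick Bauxite describes is also how early DisplayPort 1.2 MST 4K monitors were driven: the panel shows up as two 1920x2160 tiles and the source has to split every frame. The sketch below is purely conceptual; the function name and the tiny mock frame are made up for illustration.

```python
# Conceptual sketch of tiling a UHD frame into two half-width slices, the way a
# source would feed a panel that exposes itself as two 1920x2160 logical displays.

def split_for_tiles(frame_rows, tiles=2):
    """Split each scanline of a row-major frame into `tiles` equal horizontal slices."""
    width = len(frame_rows[0])
    tile_w = width // tiles
    return [
        [row[i * tile_w:(i + 1) * tile_w] for row in frame_rows]
        for i in range(tiles)
    ]

# Mock "frame": a handful of scanlines, each 3840 pixels wide (a real frame has 2160 rows).
frame = [list(range(3840)) for _ in range(4)]
left_tile, right_tile = split_for_tiles(frame)
print(len(left_tile[0]), len(right_tile[0]))  # 1920 1920: each half is driven as its own display
```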
superjawes wrote:cphite wrote:Unless you're watching on a very large screen and/or are sitting abnormally close to the screen, you aren't going to see a difference. Seriously - at around 10 feet away, with a screen of say 50 inches or so, your eye literally cannot distinguish between 1080p and 4K. If you think yours can, you're either kidding yourself or you're from Krypton.
Um...isn't 10 feet a pretty far distance, even from a larger screen? If you're viewing at that distance, I imagine you're using something where the effective image size is much larger, and the higher resolution would make a bigger difference.
kamikaziechameleon wrote:
Are there any PC GPUs that can push 4K at 60Hz or better on a single port? I would hope we could achieve 120Hz at the very least.
cynan wrote:kamikaziechameleon wrote:
Are there any PC GPUs that can push 4K at 60Hz or better on a single port? I would hope we could achieve 120Hz at the very least.
Any half-decent video card that has DisplayPort 1.2 should be able to do 4K@60Hz. I think the HD 7970 was the first able to do this.
superjawes wrote:cynan wrote:kamikaziechameleon wrote:
Are there any PC GPUs that can push 4K at 60Hz or better on a single port? I would hope we could achieve 120Hz at the very least.
Any half-decent video card that has DisplayPort 1.2 should be able to do 4K@60Hz. I think the HD 7970 was the first able to do this.
Well, if you're just talking about providing the monitor with a 4K signal at 60 Hz, yes. However, if you actually want unique data in that signal (that is, 60 rendered FPS), I think you'll need a pair of those high-end cards in SLI or Crossfire (assuming both are using DisplayPort and not HDMI/DVI).
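A rough link-budget calculation backs this up: DisplayPort 1.2 has the headroom for single-cable 4K@60Hz, while HDMI 1.4 tops out around 4K@30Hz and 4K@120Hz doesn't fit either link. The sketch below is back-of-the-envelope only; the ~20% blanking allowance and 8b/10b factors are approximations I'm assuming, not exact spec figures.

```python
# Back-of-the-envelope bandwidth check for 4K at various refresh rates versus
# the effective throughput of HDMI 1.4 and DisplayPort 1.2 (both approximate).

def needed_gbps(h, v, hz, bpp=24, blanking=1.20):
    """Approximate data rate for a video mode, including ~20% blanking overhead."""
    return h * v * hz * bpp * blanking / 1e9

links_gbps = {
    "HDMI 1.4": 340e6 * 3 * 8 / 1e9,  # ~8.2 Gbps effective (340 MHz TMDS, 8b/10b coding)
    "DP 1.2":   4 * 5.4 * 0.8,        # ~17.3 Gbps effective (4 lanes HBR2, 8b/10b coding)
}

for name, cap in links_gbps.items():
    for hz in (30, 60, 120):
        need = needed_gbps(3840, 2160, hz)
        verdict = "fits" if need <= cap else "does not fit"
        print(f"{name}: 4K@{hz}Hz needs ~{need:.1f} Gbps, {verdict} (~{cap:.1f} Gbps available)")
```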
sschaem wrote:Considering that the great majority of 1080p content is crap (bad transfers, overly compressed / re-compressed)... talking about moving to 4K is a bit premature.
Besides Blu-rays authored by true video/film professionals (and not many exist), and some rare high-bandwidth H.264 streaming, we have barely tapped into 1080p's potential (and that's sad to say in 2014).
But my hope is that 4K will help make 1080p move forward. So just for that, I'm rooting hard for "4K" and I hope it happens sooner rather than later.
My hope is on Netflix, Vudu, and Microsoft/Sony to lead the way.
(Comcast would just use that to bump my current 37 Hispanic channels to over 200 and still find a way to broadcast the Super Bowl in quality worse than the average YouTube video.)
So only when 4K projectors hit <$2000, and I can get visual masterpieces at 4K, authored right, will I take the leap... until then, let's hope 1080p gets 'fixed'.
(4K at 120"+ / 10ft with a high-quality projector would be glorious.)
On that note, beside the signal... many of the 1080p displays on the market are kind of crappy. And many people don't have them calibrated and just watch washed-out video (no true black, and/or overly bright).
Pagey wrote:I think just like any of the other "format wars," specs and codecs will eventually shake out to a couple of "standards". With more and more modern content consisting of a purely digital workflow (e.g., "films" shot with a digital cinematography camera such as the Arri Alexa, edited with a digital NLE (non-linear editing system), and mastered as a DI (digital intermediate)), 4K is going to be the primary way motion pictures are mastered for distribution. 4K-mastered content will probably continue to be down-converted to 2K for digital cinema projection and "Full HD" (1920x1080) for Blu-ray distribution for the foreseeable future. Once we have a couple of agreed-upon codecs for compression (say, HEVC/H.265 or VP9), an agreed-upon storage medium (say, 100GB Blu-ray discs), and an agreed-upon HDCP-compliant delivery channel that supports higher refresh rates (say, HDMI X.X or whatever we end up with)... then we'll start to truly see more native 4K content for mass consumption, I think. Anything shot on 35mm or 65mm that has a good-condition OCN (original camera negative) or inter-positive can be scanned at 4K and look marvelous (see Lawrence of Arabia or Patton), and I think the videophiles/film lovers will double or triple dip to obtain these titles.
As for refresh rates, I don't know what the answer is. We're still hung up with all this legacy NTSC crap from the original analog broadcast days. When it was all black and white, the broadcast transmission system was synced to the 60 Hz frequency of the power grid. Each TV frame was divided into two interlaced fields, transmitted at 60 fields per second with the odd lines first and the even lines second, resulting in 30 FPS when "recombined" on the TV. This worked, because they were only broadcasting the luma (luminance) signal, which is essentially gray scale for B&W images. Then, when NTSC color was added, they had to drop the field rate to 60/1.001 Hz (or 59.94 fields per second, aka 29.97 FPS) to prevent "beating" between the new color subcarrier and the audio carrier. When films were/are telecined for home media distribution, a technique called 3:2 pulldown was/is used to make the 24 FPS progressive film material into 29.97 FPS interlaced content. CRT TVs simply weren't progressive. Now that we're mostly on LCD devices, which are natively progressive, we can hopefully start to transition away from interlaced material. If we are talking strictly movie consumption, shot at 24 FPS progressive (23.976 FPS as well), then 30 Hz is sufficient. But some directors want higher frame rates (see The Hobbit, shot at 48 FPS, as an example). And some people want to use 4K devices as more than just TVs, so a higher refresh rate is desirable.
So, long story short: we'll eventually settle on a couple or so codecs for compression (for streaming and physical media); we'll settle on a type of physical media; we'll settle on an HDCP-compliant delivery spec/channel; we'll settle on the refresh rates the panels support... but this will take some time. Who knows how much?
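To make the 3:2 pulldown step above concrete, here is a small Python sketch that expands four 24 FPS film frames into the ten interlaced fields that cover the same sixth of a second at ~60 fields per second. The frame labels and the cadence bookkeeping are illustrative only.

```python
# 3:2 pulldown: alternate between emitting 3 fields and 2 fields per film frame,
# so 4 progressive film frames become 10 interlaced fields (5 frames at ~30 FPS).

def three_two_pulldown(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2             # the 3:2 cadence
        for _ in range(repeats):
            parity = "odd" if len(fields) % 2 == 0 else "even"
            fields.append(f"{frame}-{parity}")       # which field's lines come from which frame
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A-odd', 'A-even', 'A-odd', 'B-even', 'B-odd',
#  'C-even', 'C-odd', 'C-even', 'D-odd', 'D-even']
# Recombined in pairs, two of the five resulting frames mix fields from different
# film frames, which is where pulldown judder comes from.
```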
cynan wrote:This year looks to be the year that 4K really takes off. This is largely thanks to HDMI 2.0 allowing 4K@60Hz and TV manufacturers committing to actual products that aren't just proofs of concept.
For example, Vizio is really stepping up its 4K game this year. Not only did it have the top-performing high-end 4K sets with true 10-bit panels at CES, it also promised a P-series, starting at $999 and offered in sizes from 50" up, that will have HDMI 2.0 (4K@60Hz).
So yes, while there's no need yet for most people to run out and get 4K, given the state of available HD content, this year looks to be the first time true 4K is a real possibility for the masses.
Personally, I'll be ready in the next year or two to upgrade the living room 1080p LCD I got back in 2008. And I'll be keeping an eye on models like the Vizios above.
CityEater wrote:Pagey wrote:(e.g., "films" shot with a digital cinematography camera such as the Arri Alexa
Sorry to be picky, but the Alexa is not a 4K camera in the traditional sense (although one is in development, probably partnered with another model). Even in the world of acquisition the trend towards 4K has been slow, and few of the advertised 4K (and up) cameras would resolve 4K with a good set of lenses on a resolution chart. There's more to the equation than simply producing a frame that is 4096x2160, and tackling the front end (as in shooting the film) is still in a state of flux as filmmakers prioritize other things besides mere resolution. The fact is that even in the cinema the audience just doesn't have the eye to pick it. The number of released HFR productions can be counted on one hand.
If you look at this year's Oscar nominees for cinematography and best picture, the only ones with a chance at claiming a true 4K deliverable are the ones shot on film (and even that's debatable once you take into account duping for release prints, barring 4K digital projection). "4K" cameras were effectively locked out of the nominations this year, which is interesting.
3840x2160 has pretty much become the de facto standard for home theatre, and even that can only loosely be described as truly 4K, but because it's 16:9 it's most likely going to become the standard for TVs. Fine. I'm not against the march of progress, but for my money I would rather have a TV with better IQ, and would rather there was more of an emphasis from manufacturers there than on pushing more pixels onto the screen. A computer monitor is another equation, but they're still tackling connectivity issues to feed the screen that much bandwidth, and it's unlikely (in my opinion) we'll see any consolidation there until a new DisplayPort standard is developed, and that's not likely to play a role in the living room. I'm more excited by the prospect of 1080p OLED TVs than I am by 4K LCDs for the time being, but some of you may feel differently.
As an aside, I've been in post-production facilities and gotten to see 4K footage projected at 4K, 2K projected at 2K (both on a 4K projector and on a 2K projector), and various 1080p up-rezes on a 13-foot screen from ~4 metres away. They all looked great and sharp, and the differences were very, very small (and this is in the absolute best possible environment). Nothing holds a candle to seeing a contact print (as in first generation) from 35mm, and it wasn't resolution that made it stand out.
4K isn't a farce; it's just still in development, akin to the late nineties when HD began to be integrated into the industry. For my money it won't be what I prioritize when buying a TV, but perhaps you feel differently.
As far as 10-bit becoming a home standard (or even a computer display standard), good luck. I'm willing to bet that few people have a single device in their homes which can output/display 10-bit, including your PCs. I think that's incredibly unlikely to be part of the equation for consumers into the distant future.
Bring on those bendable Samsung OLEDs I say, just don't get Michael Bay to spruik them.
cynan wrote:CityEater wrote:As far as 10-bit becoming a home standard (or even a computer display standard), good luck. I'm willing to bet that few people have a single device in their homes which can output/display 10-bit, including your PCs. I think that's incredibly unlikely to be part of the equation for consumers into the distant future.
Bring on those bendable Samsung OLEDs I say, just don't get Michael Bay to spruik them.
Well, if I understand correctly, the Dolby Vision (yes, Dolby isn't just for surround audio standards any more, kids) standard that is just surfacing supports deeper color bit depth. This is why Vizio's (yikes, I'm beginning to sound like I work for Vizio - I don't even own a single product of theirs) Reference 4K line at CES had 10-bit panels, as they will be among the first (the first?) to apparently support the new Dolby Vision standard. If Dolby Vision takes off, then 10-bit 4K panels could become ubiquitous a lot more quickly than you might think. In related Vizio lore, they apparently needed to exert pressure on suppliers to produce 10-bit-capable signal processors to make this a reality. Given where Vizio may or may not source its parts (i.e., they don't build anything in-house), this may mean more availability of such components market-wide sooner rather than later, and more 10-bit 4K panels.
Also, Panasonic's prosumer Micro Four Thirds camera line (the GH series) will be capable of shooting 4K resolution with the introduction of the upcoming GH4. I think FPS will be limited to 30, though.
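On the 10-bit point, the arithmetic behind why extra bit depth matters on a big 4K panel is simple. The gray-ramp banding example below is my own illustration, not anything taken from the Dolby Vision spec.

```python
# Shades per channel at 8 and 10 bits, and how wide the visible "bands" of a
# full-screen-width gray ramp would be on a 3840-pixel-wide UHD panel.

for bits in (8, 10):
    levels = 2 ** bits          # distinct values per color channel
    band_px = 3840 / levels     # pixels that share each value in a full-width ramp
    print(f"{bits}-bit: {levels} levels per channel, ~{band_px:.1f} px per band")

# 8-bit: 256 levels -> ~15 px wide bands, visible as banding in smooth gradients.
# 10-bit: 1024 levels -> ~3.8 px bands, effectively invisible at viewing distance.
```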
Ryu Connor wrote:cynan wrote:CityEater wrote:As far as 10-bit becoming a home standard (or even a computer display standard), good luck. I'm willing to bet that few people have a single device in their homes which can output/display 10-bit, including your PCs. I think that's incredibly unlikely to be part of the equation for consumers into the distant future.
Bring on those bendable Samsung OLEDs I say, just don't get Michael Bay to spruik them.
Well, if I understand correctly, the Dolby Vision (yes, Dolby isn't just for surround audio standards any more, kids) standard that is just surfacing supports deeper color bit depth. This is why Vizio's (yikes, I'm beginning to sound like I work for Vizio - I don't even own a single product of theirs) Reference 4K line at CES had 10-bit panels, as they will be among the first (the first?) to apparently support the new Dolby Vision standard. If Dolby Vision takes off, then 10-bit 4K panels could become ubiquitous a lot more quickly than you might think. In related Vizio lore, they apparently needed to exert pressure on suppliers to produce 10-bit-capable signal processors to make this a reality. Given where Vizio may or may not source its parts (i.e., they don't build anything in-house), this may mean more availability of such components market-wide sooner rather than later, and more 10-bit 4K panels.
Also, Panasonic's prosumer Micro Four Thirds camera line (the GH series) will be capable of shooting 4K resolution with the introduction of the upcoming GH4. I think FPS will be limited to 30, though.
10-bit color TV panels and output devices (the PS3, for example) have been around for a while now.
Ryu Connor wrote:10-bit color TV panels and output devices (the PS3, for example) have been around for a while now.
http://en.wikipedia.org/wiki/XvYCC
http://en.wikipedia.org/wiki/Deep_color ... F48-bit.29
The problem is that the vast majority of source video is in limited RGB. It looks like, as of last year, Sony started releasing a few of their films on Blu-ray using xvYCC instead of limited RGB.
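Since limited RGB comes up here, a minimal sketch of what limited-range code values mean, and why they cost a little precision, may help. The 16-235 (studio) and 0-255 (full) ranges are the standard ones; the conversion function itself is illustrative, not taken from any particular library.

```python
# Map a limited-range ("studio swing") code value to full range, for 8- or 10-bit video.

def limited_to_full(code, bits=8):
    lo, hi = 16 << (bits - 8), 235 << (bits - 8)  # 16-235 at 8-bit, 64-940 at 10-bit
    full_max = (1 << bits) - 1
    code = min(max(code, lo), hi)                 # clip blacker-than-black / whiter-than-white
    return round((code - lo) * full_max / (hi - lo))

print(limited_to_full(16), limited_to_full(235))          # 0 255: limited black/white hit the extremes
print(limited_to_full(64, 10), limited_to_full(940, 10))  # 0 1023: the same thing at 10-bit

# An 8-bit limited-range signal only uses 220 of the 256 code values per channel,
# which is one reason gradients band; 10-bit limited range has 877 usable steps.
```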