 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

4K, how is this going to work???

Fri Feb 07, 2014 1:21 pm

Hey guys!

I wanted to have a thread going about how 4K is going to develop and be implemented.

In the interim we're sort of fumbling; I mean, right now consumer TVs are in the lead. There are 4K 240 mhz TVs out there that put to shame all the PC monitors striving for that.

But we don't even have 4K content yet. Heck, do we even have a cable or connector standard that supports 4K at 240 mhz? Currently PCs can't even put out 4K, period, without a multi-connector solution that has all kinds of problems.

Discuss.
 
danny e.
Maximum Gerbil
Posts: 4444
Joined: Thu Apr 25, 2002 3:09 pm
Location: Indonesia/Nebraska/Wisconsin

Re: 4K, how is this going to work???

Fri Feb 07, 2014 1:33 pm

It's Hz, not MHz.
Many of the TV manufacturers' refresh numbers aren't really accurate.
You don't have to feel safe to feel unafraid.
 
cphite
Graphmaster Gerbil
Posts: 1202
Joined: Thu Apr 29, 2010 9:28 am

Re: 4K, how is this going to work???

Fri Feb 07, 2014 1:43 pm

kamikaziechameleon wrote:
Hey guys!

I wanted to have a thread going about how 4K is going to develop and be implemented.

In the interim we are sorta fumbling, I mean right now consumer TV's are in the lead. There are 4K 240 mhz Tv's out there that put to shame all the PC monitors striving for that.

But we don't even have 4K content yet, heck do we even have a cable or connector standard that support 4K at 240Mhz??? Currently PC's can't even put out 4K period w/o a multi-connector solution that has all kinds of problems.

Discuss.


Honestly I think 4k is overkill for most home television use, and will be for some time.

Unless you're watching on a very large screen and/or sitting abnormally close to it, you aren't going to see a difference. Seriously: at around 10 feet away, with a screen of, say, 50 inches or so, your eye literally cannot distinguish between 1080p and 4K. If you think yours can, you're either kidding yourself or you're from Krypton.

Also, if you're thinking of streaming... the file sizes range from two to four times those of 1080p, so if you think the compression artifacts are bad on Netflix HD now, just wait till you see Netflix 4K.
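A rough back-of-the-envelope sketch of that claim (assuming the commonly cited ~60 pixels-per-degree acuity limit for 20/20 vision; the function and numbers here are just an illustration, not gospel):

```python
import math

def pixels_per_degree(diagonal_in, width_px, height_px, distance_ft):
    """Pixels per degree of visual angle for a 16:9 screen."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # screen width in inches
    pixel_pitch_in = width_in / width_px              # inches per pixel
    distance_in = distance_ft * 12
    # visual angle subtended by a single pixel, in degrees
    deg_per_px = math.degrees(2 * math.atan(pixel_pitch_in / (2 * distance_in)))
    return 1 / deg_per_px

# 50" screen viewed from 10 feet:
print(pixels_per_degree(50, 1920, 1080, 10))  # ≈ 92 ppd, already past ~60
print(pixels_per_degree(50, 3840, 2160, 10))  # ≈ 185 ppd
```

By this math, 1080p at that size and distance is already beyond the acuity limit, so the extra 4K pixels fall below what the eye can resolve.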
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 1:48 pm

danny e. wrote:
Its Hz. Not MHz.
Many of the tv manufacturers refresh numbers are not really accurate.

But you can't do 3D at a lower spec, so how do they advertise that, and what connector are they using? I didn't think HDMI could push that yet.
 
Bauxite
Gerbil Elite
Posts: 788
Joined: Sat Jan 28, 2006 12:10 pm
Location: electrolytic redox smelting plant

Still waiting for a 4k@60hz that is addressed as 1 unit

Fri Feb 07, 2014 1:49 pm

Those "240Hz" TVs are taking a 24/30/60 signal and just multiplying it, or interpolating it in some cases.

Quite a few "120Hz" TVs do the same thing instead of taking an actual 120Hz input.

If you want a good video input standard that has been the first to support higher resolutions at faster refresh rates and has no device royalty or other dumb crap, I'll give you a hint: it's not HDMI.

BTW, most 4Ks for sale today either have inputs that are capped at 30Hz or use terrible hacks such as logically addressing the panel as two displays divided down the middle.
It seems like there is a delay in getting a proper 4K panel controller chip out to the OEMs, so for now they just want to check the 4K feature box so they can sell it to people that don't know any better.
TR RIP 7/7/2019
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 1:57 pm

cphite wrote:
kamikaziechameleon wrote:
Hey guys!

I wanted to have a thread going about how 4K is going to develop and be implemented.

In the interim we are sorta fumbling, I mean right now consumer TV's are in the lead. There are 4K 240 mhz Tv's out there that put to shame all the PC monitors striving for that.

But we don't even have 4K content yet, heck do we even have a cable or connector standard that support 4K at 240Mhz??? Currently PC's can't even put out 4K period w/o a multi-connector solution that has all kinds of problems.

Discuss.


Honestly I think 4k is overkill for most home television use, and will be for some time.

Unless you're watching on a very large screen and/or are sitting abnormally close to the screen, you aren't going to see a difference. Seriously - at around 10 feet away, with a screen of say 50 inches or so, your eye literally cannot distinguish between the 1080p and 4k. If you think yours can, you're either kidding yourself or you're from Krypton.

Also, if you're thinking of streaming... the file sizes range from two to four times that of 1080p, so if you think the compression artifacts are bad on Netflix HD now, just wait til you see Netflix 4k.


I think 4K will push games more, honestly. Streaming-wise I agree, but if you look at a service like PSN, where you have the option to download your rental and then delete it after viewing, you get a Blu-ray-quality rental rather than the HORRIBLE streaming quality of Netflix, Amazon Prime, and YouTube. I think in the living room the PSN model will be adopted by these services till infrastructure catches up.

Personally I have a 65" living room TV with 7.1 surround; it's easy to see how 4K would impact TVs over 55" IMHO. I would love to have a 75 or 80" set, but that's not in the budget at the moment.

I liked the article they did on this site about how 4K could impact the LONG-stagnant PC monitor industry. I previously did a lot of work in graphics and drafting, and the Dell 30" standard of the last 10 years has simply been inadequate IMHO. Pixel density = accuracy.

The fact that about a week after I got my Dell 30" I thought it was simply too low-res speaks to the room for improvement there. I'm sure there are products out there that won't benefit from it, but I honestly don't have any of those. I have one big TV, not a bazillion small cheap Walmart deals, and I have a high-end desktop, not a half dozen Chromebooks.
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: Still waiting for a 4k@60hz that is addressed as 1 unit

Fri Feb 07, 2014 2:00 pm

Bauxite wrote:
Those "240hz" TVs are taking a 24/30/60 signal and just multiplying it, or interpolating it in some cases.

Quite a few "120hz" TVs do the same thing instead of taking an actual 120hz input.

If you want a good video input standard that has been the first to support higher resolutions at faster refresh rates and has no device royalty or other dumb crap, I'll give you a hint: its not HDMI.

BTW, most 4ks for sale today either have inputs that are capped at 30hz or terrible hacks such as logically addressing the panel as two displays divided down the middle.
It seems like there is a delay in getting a proper 4k panel controller chip out to the OEMs so for now they just want to check the 4k feature box so they can sell it to people that don't know any better.


OK, that is a real tragedy, as 4Ks are super expensive now, with only a few "deals" floating around. DisplayPort will never have a place in the living room, much as I would love it to replace HDMI. It's strange to watch this awkward transition in tech. I wasn't sure things were as bad as you said until you said it. On Amazon, most 4K TVs don't list any specs for port types other than "3x HDMI"...

WHAT A JOKE!

Are there any PC GPUs that can push 4K at 60Hz or better on a single port? I would hope we could achieve 120Hz at the very least.
 
superjawes
Minister of Gerbil Affairs
Posts: 2475
Joined: Thu May 28, 2009 9:49 am

Re: 4K, how is this going to work???

Fri Feb 07, 2014 2:23 pm

cphite wrote:
Unless you're watching on a very large screen and/or are sitting abnormally close to the screen, you aren't going to see a difference. Seriously - at around 10 feet away, with a screen of say 50 inches or so, your eye literally cannot distinguish between the 1080p and 4k. If you think yours can, you're either kidding yourself or you're from Krypton.

Um...isn't 10 feet a pretty far distance, even from a larger screen? If you're viewing at that distance, I imagine you're using something where the effective image size is much larger, and the higher resolution would make a bigger difference.
On second thought, let's not go to TechReport. It's infested by crypto bull****.
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 2:43 pm

superjawes wrote:
cphite wrote:
Unless you're watching on a very large screen and/or are sitting abnormally close to the screen, you aren't going to see a difference. Seriously - at around 10 feet away, with a screen of say 50 inches or so, your eye literally cannot distinguish between the 1080p and 4k. If you think yours can, you're either kidding yourself or you're from Krypton.

Um...isn't 10 feet a pretty far distance, evern from a larger screen? If you're viewing at that distance, I imagine you're using something where the effective image size is much larger, and the higher resolution would make a bigger difference.


I think I sit about 8 ft from my 65" TV.
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 2:59 pm

There are no 4K TVs that accept a 240 Hz signal, for the simple reason that HDMI 2.0 (and the timing controllers mated to it in the TV) cannot handle any greater bandwidth than 4K@60Hz. As others have mentioned, it's just interpolation, or the frequency at which the backlight is pulsed.

So no, TVs are not really ahead of PC monitors.
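The bandwidth arithmetic behind that is straightforward (a rough sketch: raw active-pixel rate only, ignoring blanking overhead, using the commonly quoted HDMI 2.0 figures of 18 Gbps raw / ~14.4 Gbps effective after TMDS 8b/10b encoding):

```python
def raw_video_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed active-pixel data rate in Gbit/s, ignoring blanking."""
    return width * height * fps * bits_per_pixel / 1e9

# HDMI 2.0 carries 18 Gbps raw (~14.4 Gbps effective after 8b/10b).
print(raw_video_gbps(3840, 2160, 60))   # ≈ 11.9 Gbps -> fits in HDMI 2.0
print(raw_video_gbps(3840, 2160, 240))  # ≈ 47.8 Gbps -> far beyond it
```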
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: Still waiting for a 4k@60hz that is addressed as 1 unit

Fri Feb 07, 2014 3:04 pm

kamikaziechameleon wrote:

Are there any PC GPU's that can push 4K 60 hz or better on a single port? I would hope we could achieve 120 hz at the very least.


Any half-decent video card that has DisplayPort 1.2 should be able to do 4K@60Hz. I think the HD 7970 was the first to be able to do this.
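Rough numbers on why DP 1.2 has the headroom (a sketch from the commonly quoted DP 1.2 figures: four lanes at HBR2 rate, 8b/10b encoded):

```python
# DisplayPort 1.2: 4 lanes x 5.4 Gbps (HBR2), 8b/10b encoded
dp12_effective_gbps = 4 * 5.4 * 8 / 10        # 17.28 Gbps of usable pixel data
need_4k60_gbps = 3840 * 2160 * 60 * 24 / 1e9  # ≈ 11.9 Gbps of active pixels
print(need_4k60_gbps < dp12_effective_gbps)   # True: 4K@60 fits in one stream
```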
 
superjawes
Minister of Gerbil Affairs
Posts: 2475
Joined: Thu May 28, 2009 9:49 am

Re: Still waiting for a 4k@60hz that is addressed as 1 unit

Fri Feb 07, 2014 3:07 pm

cynan wrote:
kamikaziechameleon wrote:

Are there any PC GPU's that can push 4K 60 hz or better on a single port? I would hope we could achieve 120 hz at the very least.


Any video half decent video card that has displayport 1.2 should be able to do 4K@60Hz. I think the HD 7970 was the first to be able to do this..

Well, if you're just talking about driving the monitor at 4K resolution and 60 Hz, yes. However, if you actually want unique data in that signal, i.e. an actual 60 FPS, I think you'll need a pair of those high-end cards in SLI or CrossFire (assuming both are using DisplayPort and not HDMI/DVI).
On second thought, let's not go to TechReport. It's infested by crypto bull****.
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 3:11 pm

http://bgr.com/2014/01/06/85-inch-sharp ... ree-8k-tv/

8K, WTF!

Seriously though, not only is there no content, but we don't have HDMI support for 4K at 3D frequencies yet. How is this supposed to work?

I'd just be happy with one of those 85" 4K TVs floating around on the internet.

http://www.amazon.com/s/ref=sr_nr_p_n_feature_keywords_0?rh=n%3A172282%2Cn%3A1266092011%2Cn%3A172659%2Cn%3A6459737011%2Ck%3A4K%2Cp_n_size_browse-bin%3A3578043011%2Cp_n_feature_keywords_three_browse-bin%3A7688788011&keywords=4K&ie=UTF8&qid=1391803536&rnid=7688787011
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: Still waiting for a 4k@60hz that is addressed as 1 unit

Fri Feb 07, 2014 3:14 pm

superjawes wrote:
cynan wrote:
kamikaziechameleon wrote:

Are there any PC GPU's that can push 4K 60 hz or better on a single port? I would hope we could achieve 120 hz at the very least.


Any video half decent video card that has displayport 1.2 should be able to do 4K@60Hz. I think the HD 7970 was the first to be able to do this..

Well if you're just talking about providing the monitor with 4k resolution at 60 Hz, yes. However, I think if you actually want to have unique data in that signal, or 60 FPS, I think you'll need a pair of those high-end cards in SLI or Crossfire (assuming both are using DisplayPort and not HDMI/DVI).


OK, so I've divined that:

The 3D TVs that are 4K are a lie. And none of them support DisplayPort. HDMI standards and devices on the market today can't push above 4K@30Hz. Interesting. So basically 4K is still a farce, similar to what it was 12 months ago when there were TVs labeled 4K that really were just AA'd 1080p monitors.
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 3:20 pm

Do we have any timetables for monitors that support true 240Hz (there were some last generation)?
How about a timetable for the adoption of an HDMI standard that supports true 3D TV playback by supporting 4K@240Hz?
How about device (GPU, console, Blu-ray player) support for 4K@240Hz?

We can't have content for 4K before flipping 4K devices exist.

It's pretty clear that games will be the first to move into 4K, especially on PC, as they are the most "modular": there is no need for remastering, as it were.

Consoles still don't fully support 1080p in the new generation, which is annoying.
 
sschaem
Gerbil Team Leader
Posts: 282
Joined: Tue Oct 02, 2007 11:05 am

Re: 4K, how is this going to work???

Fri Feb 07, 2014 3:25 pm

Considering that the great majority of 1080p content is crap (bad transfers, overly compressed/re-compressed)... talking about moving to 4K is a bit premature.

Besides Blu-rays authored by true video/film professionals (and not many exist) and some rare high-bandwidth H.264 streaming, we have barely tapped into 1080p's potential. (And that's sad to say in 2014.)

But my hope is that 4K will help 1080p move forward. So just for that, I'm rooting hard for "4K" and I hope it happens sooner rather than later.
My hope is for Netflix, Vudu, and Microsoft/Sony to lead the way.
(Comcast would just use it to bump my current 37 Hispanic channels to over 200 and still find a way to broadcast the Super Bowl in worse quality than the average YouTube video.)

So only when 4K projectors hit <$2000, and I can get visual masterpieces at 4K, authored right, would I take the leap... until then, let's hope 1080p gets 'fixed'.
(4K at 120"+ / 10 ft with a high-quality projector would be glorious.)

On that note, beside the signal... many of the 1080p displays on the market are kind of crappy. And many people don't have them calibrated and just watch washed-out video (no true black, and/or overly bright).
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 3:33 pm

sschaem wrote:
Considering that the greater majority of 1080p content is crap (bad transfers, overly compressed / re-compressed) ... talking about moving to 4K is a bit premature.

Beside blu rays authored by true video/film professional (and not many exists), and some rare h.264 high bandwidth streaming, we have barely tapped into the 1080p potential. (and thats sad to say in 2014)

But my hope is that 4K will help make 1080p move forward. So just for that, I'm rooting hard for "4K" and I hope it happen sooner then later.
My hope is on Netflix, Vudu, and Microsoft/Sony to lead the way.
(Comcast would just use that to bump my current 37 Hispanic channels to over 200 and still find a way to broadcast the superbowl in a quality worse then the average youtube video)

So only when 4K projector hit <2000$, and I can get visual masterpieces at 4K, authored right, would I take the leap... until then, lets hope 1080p gets 'fixed'
(4K at 120"+ / 10ft with a high quality projector would be glorious)

On that note, beside the signal... many of the 1080p display on the market are kind of crappy. And many people dont have them calibrated and just watch washed out video. (no true black, and/or over bright)


EVERYTHING YOU SAY IS TRUE! I agree with you 100 percent. I know that consoles need the push to uniformly support native 1080p rendering at the very least, and this would push them to do that. You hit the nail on the head about good Blu-rays being HARD to find. There are entire online communities built up around reviewing the quality of 1080p Blu-ray transfers and surround sound. That's just video; audio has been stuck at 5.1 as the standard for too long. 9.2 is out there, and it's rare for 7.1 to be supported natively on a movie. Jurassic Park in 7.1 was a huge thing, while the Gladiator and Braveheart Blu-rays are a mess (while still being really high sellers).

This is a big part of why I'm pushing on the forefront of this stuff. Not always because we need 4K for everything, but darn, 1080p adoption has been slow to say the least. 4K will be a big plateau I think, similar to how 1080p was a plateau for about 4-5 years.

I've given up hope of audio getting any sort of standard support. It seems like having any surround sound vs. stereo is the dividing line in the sand. :^/
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 3:44 pm

This year looks to be the year that 4K really takes off, largely thanks to HDMI 2.0 allowing 4K@60Hz and TV manufacturers committing to actual products that aren't just proofs of concept.

For example, Vizio is really stepping up its 4K game this year. Not only did it have the top-performing high-end 4K sets with true 10-bit panels at CES, it also promised a P-series, starting at $999, that will be offered in 50" and up and will have HDMI 2.0 (4K@60Hz).

So yes, while most people have no need to run out and get 4K yet, given the state of available HD content, this year looks to be the first time true 4K is a real possibility for the masses.

Personally, I'll be ready to upgrade the living room 1080p LCD I got back in 2008 in the next year or two, and I'll be keeping an eye on models like the Vizios above.
 
Pagey
Gerbil Jedi
Posts: 1569
Joined: Thu Dec 29, 2005 10:29 am
Location: Middle TN

Re: 4K, how is this going to work???

Fri Feb 07, 2014 3:58 pm

I think just like any of the other "format wars," specs and codecs will eventually shake out to a couple of "standards". With more and more modern content consisting of a purely digital workflow (e.g., "films" shot with a digital cinematography camera such as the Arri Alexa, edited with a digital NLE (non-linear editing system), and mastered as a DI (digital intermediate)), 4K is going to be the primary way motion pictures are mastered for distribution. 4K-mastered content will probably continue to be down-converted to 2K for digital cinema projection and "Full HD" (1920x1080) for Blu-ray distribution for the foreseeable future. Once we have a couple of agreed-upon codecs for compression (say, HEVC/H.265 or VP9), an agreed-upon storage medium (say, 100GB Blu-ray discs), and an agreed-upon HDCP-compliant delivery channel that supports higher refresh rates (say, HDMI X.X or whatever we end up with)...then we'll start to truly see more native 4K content for mass consumption, I think. Anything shot on 35mm or 65mm that has a good-condition OCN (original camera negative) or inter-positive can be scanned at 4K and look marvelous (see Lawrence of Arabia or Patton), and I think the videophiles/film lovers will double or triple dip to obtain these titles.

As for refresh rates, I don't know what the answer is. We're still hung up with all this legacy NTSC crap from the original analog broadcast days. When it was all black and white, the broadcast transmission system was synced to the 60Hz refresh rate of the power grid. TV images/frames were divided into 60 fields per second, with the odd fields transmitted first and the even fields second, resulting in 30FPS when "recombined" on the TV. This worked because they were only broadcasting the luma (luminance) signal, which is essentially gray scale for B&W images. Then, when NTSC color was added, they had to drop the refresh rate to 60Hz/1.001 (or 59.94 fields per second, aka 29.97FPS) to prevent "beating" from the new color encoding. When films were/are telecined for home media distribution, a technique called 3:2 pulldown was/is used to make the 24FPS progressive film material into 29.97FPS interlaced content. CRT TVs simply weren't progressive. Now that we're mostly on LCD devices, which are natively progressive, we can hopefully start to transition away from interlaced material. If we are talking strictly movie consumption, shot at 24FPS progressive (23.976FPS as well), then 30Hz is sufficient. But some directors want higher frame rates (see The Hobbit, shot at 48FPS, as an example). And some people want to use 4K devices as more than just TVs, so a higher refresh rate is desirable.
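The 3:2 cadence is simple enough to sketch in code (a toy illustration only; it ignores the odd/even field split and the 1/1.001 rate slowdown):

```python
def three_two_pulldown(frames):
    """Repeat 24p frames in a 3-2-3-2 cadence: 4 film frames -> 10 fields."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

# Four film frames become ten video fields, i.e. 24 fps -> 60 fields/s.
print(three_two_pulldown(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```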

So, long story short: we'll eventually settle on a couple or so codecs for compression (for streaming and physical media); we'll settle on a type of physical media; we'll settle on an HDCP-compliant delivery spec/channel; we'll settle on the refresh rates the panels support...but this will take some time. Who knows how much?
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 4:20 pm

Pagey wrote:
I think just like any of the other "format wars," specs and codecs will eventually shake out to a couple of "standards". With more and more modern content consisting of a purely digital workflow (e.g., "films" shot with a digital cinematography camera such as the Arri Alexa, edited with a digital NLE (non-linear editing system), and mastered as a DI (digital intermediate), 4K is going to be the primary way motion pictures are mastered for distribution. 4K mastered content will probably continue to be down-converted to 2K for digital cinema projection and "Full HD" (1920x1080) for Blu-ray distribution for the foreseeable future. Once we have a couple of agreed upon codecs for compression (say, HEVC/H.265 or VP9), an agreed upon storage medium (say, 100GB Blu-ray discs), and an agreed upon HDCP-compliant delivery channel that supports higher refresh rates (say, HDMI X.X or whatever we end up with)...then we'll start to truly see more native 4K content for mass consumption, I think. Anything shot on 35mm or 65mm that has a good condition OCN (original camera negative) or inter-positive can be scanned at 4K and look marvelous (see Lawrence of Arabia or Patton), and I think the video philes/film lovers will double or triple dip to obtain these titles.

As for refresh rates, I don't know what the answer is. We're still hung up with all this legacy NTSC crap from the original analog broadcast days. When it was all black and white, the broadcast transmission system was synced to the 60Hz refresh rate of the power grid. TV images/frames were divided into 60 fields, with the odd transmitted first, and the even fields second, resulting in 30FPS when "recombined" on the TV. This worked, because they were only broadcasting the luma (luminance) signal, which is essentially gray scale for B&W images. Then, when NTSC color was added, they had to drop the refresh rate to 60Hz/.001 (or 59.94 fields per second aka 29.97FPS) to prevent "beading" from the new color encoding. When films were/are telecined for home media distribution, a technique called 3:2 pulldown was/is used to make the 24FPS progressive film material into 29.97FPS interlaced content. CRT TVs simply weren't progressive. Now that we're mostly on LCD devices, which are natively progressive, we can hopefully start to transition away from interlaced material. If we are talking strictly movie consumption, shot at 24FPS progressive (23.96FPS as well), then 30Hz is sufficient. But some directors want higher frame rates (see The Hobbit shot at 48PFS as an example). And some people want to use 4K devices are more than just TVs, so a higher refresh rate is desirable.

So, long story story: we'll eventually settle on a couple or so codecs for compression (for streaming and physical media); we'll settle on a type of physical media; we'll settle on an HDCP-compliant delivery spec/channel; we'll settle on the refresh rates the panels support...but this will take some time. Who knows how much?


LOL, you pretty much said it all.
 
kamikaziechameleon
Gerbil Elite
Topic Author
Posts: 911
Joined: Wed Dec 03, 2008 3:38 pm

Re: 4K, how is this going to work???

Fri Feb 07, 2014 4:21 pm

cynan wrote:
This year looks be the year that 4K really takes off. This is largely thanks to HDMI 2.0 allowling 4k@60Hz and TV manufacturers committed to actual products that aren't just proof of concepts.

For example, Vizio is really stepping up its 4K game this year. Not only did it have the top performing high-end 4K sets with true 10 bit panels at CES, it also promised a P-series, to start at $999, that will be offered in 50" and up that will have HDMI 2.0 (4K@60Hz).

So yes, while no need yet for most to run out and get 4K due to the state of available HD content, this year looks to be the first time true 4K is a real possibility for the masses.

Personally, I'll be ready to upgrade my living room 1080p LCD that I got back in 2008 in the next year or two. And I'll be keeping an eye on models like the Vizios above.


That is exciting. The refresh rate thing is annoying for sure, but here's to hoping that gets ironed out ASAP. :)
 
jihadjoe
Gerbil Elite
Posts: 835
Joined: Mon Dec 06, 2010 11:34 am

Re: 4K, how is this going to work???

Fri Feb 07, 2014 6:40 pm

I remember reading somewhere that the BBC producer assigned to cover the last Olympics actually thinks 4K isn't enough and we should skip it for 8K. His reasoning was that 8K is basically the "retina" resolution for a fairly big screen at TV viewing distances.

To back up his theory, he has 8K footage of the Olympics (taken with some help from Japan, which happened to have the world's only 8K cameras at the time). Not sure if it comes with the matching 22.2 surround, the audio portion of Super Hi-Vision.
Last edited by jihadjoe on Sat Feb 08, 2014 12:26 am, edited 1 time in total.
 
Bauxite
Gerbil Elite
Posts: 788
Joined: Sat Jan 28, 2006 12:10 pm
Location: electrolytic redox smelting plant

Re: 4K, how is this going to work???

Fri Feb 07, 2014 8:53 pm

Interesting, I did not know about the Vizios.

Not surprised, though; they were the ones that brought out good full-array backlit screens that didn't suck, for a good price.
(160-zone 55" owner here; I have yet to see anything under $3k that looks better, and it's several years old now. The industry has regressed.)

I'm really in the market for a big computer display, though. I want to buy a 60Hz-capable 4K; I do not care about 3D or "smart" garbage or even a tuner. It would be nice if I didn't have to pay for those features just to have them sit unused, too.

Although if a TV managed to check those boxes at $1k, even with the fluff I won't use, it would be quite tempting. I can go up to ~60" and still have a proper viewing distance for computer use due to the layout of the room.
TR RIP 7/7/2019
 
CityEater
Gerbil
Posts: 80
Joined: Sun Feb 05, 2012 2:30 am

Re: 4K, how is this going to work???

Fri Feb 07, 2014 10:43 pm

Pagey wrote:
(e.g., "films" shot with a digital cinematography camera such as the Arri Alexa


Sorry to be picky, but the Alexa is not a 4K camera in the traditional sense (although one is in development, probably partnered with another model). Even in the world of acquisition the trend towards 4K has been slow, and few of the advertised 4K (and up) cameras would resolve 4K with a good set of lenses on a resolution chart. There's more to the equation than simply producing a frame which is 4096x2160, and tackling the front end (as in shooting the film) is still in a state of flux as filmmakers prioritize other things besides mere resolution. The fact is that even in the cinema the audience just doesn't have the eye to pick it. The number of released HFR productions can be counted on one hand.

If you look at this year's Oscar nominees for cinematography and best picture, the only ones which have a chance at claiming a true 4K deliverable are the ones shot on film (and even that's debatable once you take into account duping for release prints, barring 4K digital projection). "4K" cameras were effectively locked out of the nominations this year, which is interesting.

3840x2160 has pretty much become the de facto standard for home theatre, and even that can only loosely be described as truly 4K, but because it's 16:9 it's most likely going to become the standard for TVs. Fine. I'm not against the march of progress, but for my money I would rather have a TV with better IQ, and would rather manufacturers put more emphasis there than on pushing more pixels on screen. A computer monitor is another equation, but they're still tackling connectivity issues to feed the screen that much bandwidth, and it's unlikely (in my opinion) we'll see any consolidation there until a new DisplayPort standard is developed, and that's not likely to play a role in the living room. I'm more excited by the prospect of 1080p OLED TVs than by 4K LCDs for the time being, but some of you may feel differently.

As an aside, I've been in post-production facilities and gotten to see 4K footage projected at 4K, 2K projected at 2K (both on a 4K projector and on a 2K projector), and various 1080p up-rezzes, on a 13-foot screen from ~4 metres away. They all looked great and sharp, and the differences were very, very small (and this is in the absolute best possible environment). Nothing holds a candle to getting to see a contact print (as in 1st generation) from 35mm, and it wasn't resolution that made it stand out.
4K isn't a farce; it's just still in development, akin to the late nineties when HD began to be integrated into the industry. For my money it won't be what I prioritize when buying a TV, but perhaps you feel differently.

As far as 10-bit becoming a home standard (or even a computer display standard), good luck. I'm willing to bet that few people have a single device in their homes which can output/display 10-bit, including your PCs. I think that's incredibly unlikely to be part of the equation for consumers into the distant future.
Bring on those bendable Samsung OLEDs, I say; just don't get Michael Bay to spruik them.
 
Pagey
Gerbil Jedi
Posts: 1569
Joined: Thu Dec 29, 2005 10:29 am
Location: Middle TN

Re: 4K, how is this going to work???

Sat Feb 08, 2014 7:19 am

CityEater wrote:
Pagey wrote:
(e.g., "films" shot with a digital cinematography camera such as the Arri Alexa


Sorry to be picky but the Alexa is not a 4k camera in the traditional sense (although in development and probably partnered with another model) . Even in the world of acquisition the trend towards 4k has been slow and few of the advertised 4k (and up) cameras would resolve 4k with a good set of lenses on a resolution chart. Theres more to the equation than simply producing a frame which is 4096X2160 and tackling the front end (as in shooting the film) is still in a state of flux as film makers prioritize other things besides mere resolution. The fact is that even in the cinema the audience just doesn't have the eye to pick it. The number of released HFR productions can be counted on one hand.

If you look at this year's Oscar nominees for cinematography and best picture, the only ones which have a chance at claiming a true 4K deliverable are the ones shot on film (and even that's debatable once you take into account duping for release prints - barring 4K digital projection). "4K" cameras were effectively locked out of the nominations this year, which is interesting.

3840x2160 has pretty much become the de facto standard for home theatre, and even that can only loosely be described as truly 4K, but because it's 16:9 it's most likely going to become the standard for TVs. Fine. I'm not against the march of progress, but for my money I would rather have a TV with better IQ, and I'd rather manufacturers put their emphasis there than on pushing more pixels on screen. A computer monitor is another equation, but they're still tackling connectivity issues to feed the screen that much bandwidth, and it's unlikely (in my opinion) we'll see any consolidation there until a new DisplayPort standard is developed - and that's not likely to play a role in the living room. I'm more excited by the prospect of 1080p OLED TVs than I am 4K LCDs for the time being, but some of you may feel differently.

As an aside, I've been in post-production facilities and gotten to see 4K footage projected at 4K, 2K projected at 2K (both on a 4K projector and on a 2K projector), and various 1080p up-rezzes on a 13-foot screen from ~4 metres away. They all looked great and sharp, and the differences were very, very small (and this is in the absolute best possible environment). Nothing holds a candle to seeing a contact print (as in 1st generation) from 35mm, and it wasn't resolution that made it stand out.
4K isn't a farce; it's just still in development, akin to the late nineties when HD began to be integrated into the industry. For my money it won't be what I prioritize when buying a TV, but perhaps you feel differently.

As far as 10-bit becoming a home standard (or even a computer display standard), good luck. I'm willing to bet that few people have a single device in their homes which can output/display in 10-bit, including your PCs. I think that's incredibly unlikely to be part of the equation for consumers into the distant future.
Bring on those bendable Samsung OLEDs I say, just don't get Michael Bay to spruik them.


Sorry, I didn't mean to imply that the Alexa was a 4K digital cinematography camera. It can capture using ARRIRAW or full 4:4:4 RGB. I was venturing a guess that many 'films' shot with the Alexa are now being mastered to a 4K DI (digital intermediate). See the technical info for Iron Man 3 as an example: http://www.imdb.com/title/tt1300854/tec ... tt_dt_spec

I should have been more clear. I blame it on all the Sierra Nevada Stout. Like you, more than additional pixels, I am much more curious to see some video for home consumption that's using something more than 4:2:0 component video with 8 bit color depth. They say the eye can't tell much of a difference when we throw away all that color with sub-sampling, but I'd love to see something with 4:2:2, at least, on a quality LCD TV at home. I do not, however, have even a guess as to how expensive it would be to produce the equipment necessary to give us a greater color bit depth for home entertainment.
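A quick sketch of what those subsampling schemes cost in raw data (my own back-of-envelope, counting raw samples before any codec gets involved):

```python
# Raw samples per frame under each chroma-subsampling scheme, at 8 bits
# (1 byte) per sample. 4:2:2 halves chroma resolution horizontally;
# 4:2:0 halves it both horizontally and vertically.
def frame_bytes(width, height, scheme):
    luma = width * height                          # one Y sample per pixel
    if scheme == "4:4:4":
        chroma = 2 * width * height                # full-res Cb and Cr
    elif scheme == "4:2:2":
        chroma = 2 * (width // 2) * height
    elif scheme == "4:2:0":
        chroma = 2 * (width // 2) * (height // 2)
    else:
        raise ValueError(scheme)
    return luma + chroma

for s in ("4:4:4", "4:2:2", "4:2:0"):
    mib = frame_bytes(1920, 1080, s) / 2**20
    print(f"{s}: {mib:.2f} MiB per 1080p frame")
```

So 4:2:0 carries half the raw data of 4:4:4, with 4:2:2 in between - that halving is the bandwidth saving broadcast and Blu-ray have been trading color resolution for.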

However long it takes, the marketing folks have sure gotten hold of 4K as a selling point. Like the Superbit DVDs of old, many Blu-rays are now actively stating on the package "Mastered in 4K". Which is fine. That just means when we finally transition to 4K (or Ultra HD, really), the movies have already been mastered at 4K and simply have to be pressed onto the media that will handle it, or streamed with the appropriate high-efficiency codec. The marketing folks claim that the current crop of "Mastered in 4K" Blu-rays produce better color, but as long as it's all component 4:2:0, I don't see how they can make the claim. But I am not a pro at this stuff. I only study it as a hobby because it fascinates the hell out of me. I still marvel at how groups of engineers were able to come up with the first analog NTSC encoding using a modulated radio wave!

Finally, I am looking forward to the 4K remaster of The Good, The Bad, and the Ugly due out this summer via the MGM 90th celebration. Assuming it's handled properly, I think it will be a pleasant surprise to see just how good a Techniscope-shot film can look on Blu when treated right. Let's hope it at least looks as good as the current Italian/Mondo release you can get across the pond. If they DNR it to death again, I still refuse to purchase it.
 
Chrispy_
Maximum Gerbil
Posts: 4670
Joined: Fri Apr 09, 2004 3:49 pm
Location: Europe, most frequently London.

Re: 4K, how is this going to work???

Sat Feb 08, 2014 8:41 am

4K will continue to be niche for a while; we're not even at the point in Europe where all broadcasts are HD yet, and the HD we do have is only 720p for the most part.

Internet bandwidth is an issue too. We have FTTC fibre here, which means most people have access to 40Mb/s or more, but it's expensive enough that the norm is still 8Mb/s internet for most people, which can barely stream 1080p.
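For scale, here's my own back-of-envelope on what an 8Mb/s pipe is up against, assuming an 8-bit 4:2:0 source at 30 fps:

```python
# Raw (uncompressed) bitrate at 8-bit 4:2:0 (1.5 samples per pixel),
# and the compression ratio an 8 Mbit/s stream implies.
STREAM_MBPS = 8

def raw_mbps(width, height, fps):
    return width * height * 1.5 * 8 * fps / 1e6  # Mbit/s

for name, (w, h) in {"1080p": (1920, 1080), "2160p": (3840, 2160)}.items():
    raw = raw_mbps(w, h, 30)
    ratio = raw / STREAM_MBPS
    print(f"{name} @ 30 fps: raw {raw:.0f} Mbit/s -> ~{ratio:.0f}:1 compression at {STREAM_MBPS} Mbit/s")
```

Roughly 90:1 compression for 1080p and ~370:1 for 4K at the same bitrate, which is why 4K streaming is leaning so hard on HEVC.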

Games won't be 4K anytime soon. I'm running a pair of 7970s and I'm struggling to hit a solid 60 fps in some games even at 1440p (Crysis 3, BF4). The tests at 4K so far seem to indicate that you'll need to damn-near triple your GPU throughput to get 4K running as well as 1080p, and 3x the GPU actually costs 6x as much. It seems like a waste for anyone who actually has monetary concerns, especially given that the next-gen consoles seem to be struggling to output even 1080p at the moment - there'll likely be little to no 4K gaming content for a while.
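The rough arithmetic behind that (shading cost isn't purely per-pixel, so treat the ratios as a first-order estimate, not a benchmark):

```python
# Pixel counts relative to 1080p - the first-order reason 4K needs
# roughly 3-4x the GPU throughput.
base = 1920 * 1080
for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K UHD": (3840, 2160)}.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / base:.2f}x 1080p)")
```

4K UHD is exactly 4x the pixels of 1080p and 2.25x the pixels of 1440p, so anything that struggles at 1440p today has no hope at 4K.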

Right now we're in an unusual situation where high-end graphics haven't really progressed much. Just over two years have passed since the 7970 was the $550 flagship, and even if you buy a 290X or a 780Ti today you haven't quite managed to double the performance of that card - a 7990 is slower than two 7970s, and yet it still comfortably beats the 780Ti and the 290X in Uber mode. I say "comfortably" because the 7990 is 10% faster than a hot-clocked 780Ti even in games like Crysis 3 and BF4 where Nvidia cards do better. In AMD-friendly game engines, the margin is more like 40%.

So, maybe on its 3rd birthday, the 7970 will finally be only half the speed of the latest GPU. That still means the latest, most expensive GPU on the market will be inadequate for playing even today's games at 4K - you can completely forget about games a year from now!
Congratulations, you've noticed that this year's signature is based on outdated internet memes; CLICK HERE NOW to experience this unforgettable phenomenon. This sentence is just filler and as irrelevant as my signature.
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: 4K, how is this going to work???

Sat Feb 08, 2014 1:34 pm

CityEater wrote:
As far as 10-bit becoming a home standard (or even a computer display standard), good luck. I'm willing to bet that few people have a single device in their homes which can output/display in 10-bit, including your PCs. I think that's incredibly unlikely to be part of the equation for consumers into the distant future.
Bring on those bendable Samsung OLEDs I say, just don't get Michael Bay to spruik them.


Well, if I understand correctly, the Dolby Vision standard that is just surfacing (yes, Dolby isn't just for surround audio standards any more, kids) supports deeper color bit depth. This is why Vizio's (yikes, I'm beginning to sound like I work for Vizio - I don't even own a single product of theirs) Reference 4K line at CES had 10-bit panels, as they will be among the first (the first?) to apparently support the new Dolby Vision standard. If Dolby Vision takes off, then 10-bit 4K panels could become ubiquitous a lot more quickly than you might think. In related Vizio lore, they apparently needed to pressure suppliers to produce 10-bit-capable signal processors to make this a reality. Given where Vizio may or may not source its parts (i.e., they don't build anything in-house), this may mean more availability of such components market-wide sooner rather than later, and more 10-bit 4K panels.

Also, Panasonic's prosumer Micro Four Thirds line (the GH series) will be capable of shooting at 4K resolution with the introduction of the upcoming GH4. I think frame rate will be limited to 30 fps, though.
 
Ryu Connor
Global Moderator
Posts: 4369
Joined: Thu Dec 27, 2001 7:00 pm
Location: Marietta, GA
Contact:

Re: 4K, how is this going to work???

Sat Feb 08, 2014 3:11 pm

cynan wrote:
CityEater wrote:
As far as 10-bit becoming a home standard (or even a computer display standard), good luck. I'm willing to bet that few people have a single device in their homes which can output/display in 10-bit, including your PCs. I think that's incredibly unlikely to be part of the equation for consumers into the distant future.
Bring on those bendable Samsung OLEDs I say, just don't get Michael Bay to spruik them.


Well, if I understand correctly, the Dolby Vision standard that is just surfacing (yes, Dolby isn't just for surround audio standards any more, kids) supports deeper color bit depth. This is why Vizio's (yikes, I'm beginning to sound like I work for Vizio - I don't even own a single product of theirs) Reference 4K line at CES had 10-bit panels, as they will be among the first (the first?) to apparently support the new Dolby Vision standard. If Dolby Vision takes off, then 10-bit 4K panels could become ubiquitous a lot more quickly than you might think. In related Vizio lore, they apparently needed to pressure suppliers to produce 10-bit-capable signal processors to make this a reality. Given where Vizio may or may not source its parts (i.e., they don't build anything in-house), this may mean more availability of such components market-wide sooner rather than later, and more 10-bit 4K panels.

Also, Panasonic's prosumer Micro Four Thirds line (the GH series) will be capable of shooting at 4K resolution with the introduction of the upcoming GH4. I think frame rate will be limited to 30 fps, though.


10-bit color TV panels and output devices (the PS3, for example) have been around for a while now.

http://en.wikipedia.org/wiki/XvYCC
http://en.wikipedia.org/wiki/Deep_color ... F48-bit.29

The problem is that the vast majority of source video is in limited RGB. It looks like as of last year Sony started releasing a few of their films on Blu-ray using xvYCC instead of limited RGB.
All of my written content here on TR does not represent or reflect the views of my employer or any reasonable human being. All content and actions are my own.
 
cynan
Graphmaster Gerbil
Posts: 1160
Joined: Thu Feb 05, 2004 2:30 pm

Re: 4K, how is this going to work???

Sat Feb 08, 2014 6:00 pm

Ryu Connor wrote:
cynan wrote:
CityEater wrote:
As far as 10-bit becoming a home standard (or even a computer display standard), good luck. I'm willing to bet that few people have a single device in their homes which can output/display in 10-bit, including your PCs. I think that's incredibly unlikely to be part of the equation for consumers into the distant future.
Bring on those bendable Samsung OLEDs I say, just don't get Michael Bay to spruik them.


Well, if I understand correctly, the Dolby Vision standard that is just surfacing (yes, Dolby isn't just for surround audio standards any more, kids) supports deeper color bit depth. This is why Vizio's (yikes, I'm beginning to sound like I work for Vizio - I don't even own a single product of theirs) Reference 4K line at CES had 10-bit panels, as they will be among the first (the first?) to apparently support the new Dolby Vision standard. If Dolby Vision takes off, then 10-bit 4K panels could become ubiquitous a lot more quickly than you might think. In related Vizio lore, they apparently needed to pressure suppliers to produce 10-bit-capable signal processors to make this a reality. Given where Vizio may or may not source its parts (i.e., they don't build anything in-house), this may mean more availability of such components market-wide sooner rather than later, and more 10-bit 4K panels.

Also, Panasonic's prosumer Micro Four Thirds line (the GH series) will be capable of shooting at 4K resolution with the introduction of the upcoming GH4. I think frame rate will be limited to 30 fps, though.


10-bit color TV panels and output devices (the PS3, for example) have been around for a while now.



Sure, but not for 4K TVs. Whether or not previous 4K TVs actually had 10-bit panels is irrelevant, because HDMI 1.4 (and the accompanying electronics in the TV) could only supply enough bandwidth for 8-bit color depth at 4K, and only at up to 30 Hz. HDMI 2.0 changes that. And Dolby Vision may help speed up adoption.
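The bandwidth math checks out; here's a rough sketch counting active-pixel payload only (real TMDS links also carry blanking intervals and 8b/10b encoding overhead, so HDMI 1.4's usable video rate is around 8.16 Gbit/s and HDMI 2.0's around 14.4):

```python
# Uncompressed RGB/4:4:4 video payload, ignoring blanking intervals.
def data_rate_gbps(width, height, fps, bits_per_channel):
    return width * height * fps * bits_per_channel * 3 / 1e9  # 3 channels

for fps in (30, 60):
    for depth in (8, 10):
        rate = data_rate_gbps(3840, 2160, fps, depth)
        print(f"3840x2160 @ {fps} Hz, {depth}-bit: {rate:.1f} Gbit/s")
```

8-bit 4K60 alone is ~12 Gbit/s, well past HDMI 1.4 - hence the 30 Hz ceiling - and 10-bit 4K60 at full 4:4:4 is tight even for HDMI 2.0, which is why 2.0 drops to 4:2:2/4:2:0 chroma for deep color at 60 Hz.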
 
CityEater
Gerbil
Posts: 80
Joined: Sun Feb 05, 2012 2:30 am

Re: 4K, how is this going to work???

Sat Feb 08, 2014 8:57 pm

Ryu Connor wrote:
10-bit color TV panels and output devices (the PS3, for example) have been around for a while now.

http://en.wikipedia.org/wiki/XvYCC
http://en.wikipedia.org/wiki/Deep_color ... F48-bit.29

The problem is that the vast majority of source video is in limited RGB. It looks like as of last year Sony started releasing a few of their films on Blu-ray using xvYCC instead of limited RGB.



You're right, there's plenty of stuff which claims 10-bit capability, but putting together a 10-bit chain is a complicated thing. As far as I know (and I could be wrong), at the moment you need SDI or DisplayPort to transport 10-bit, or a LUT box to transport it over HDMI (I have to admit this is reaching the limit of my video-standards knowledge). Complicated enough that it's going to be a long time before it reaches a consumer level. Consumers would be better served by TVs which held their calibration better, or were self-calibrating. For that matter, they would be better served by decent calibration in the first place and by scrapping dynamic contrast, frame interpolation modes, sales-floor saturation, etc. That is, if your goal is accuracy.
Post houses have been using calibrated plasmas (and more recently LCDs) as client monitors for a long time. Ultimately I'm not sure how useful 10-bit is going to be in the living room when there's still life left in making 8-bit equipment more accurate. A lot of that information is effectively invisible unless you're working with the footage in post. How useful is that when you're on the couch?

The GH4 is a good example of what I meant: it's "4K", but only in the sense that it creates a 4096x2160 frame. If you point it at a res chart there's no way it's going to resolve that much information, even using its SDI out to an external recorder. Not that it doesn't look like a great bit of kit (although bulky...), but it's entering a crowded market with new competitors every day. Luckily there's a pretty strong Panasonic mod/hack community, so hopefully they'll embrace it fully, but Panasonic isn't looking to cannibalise its pro/prosumer video market with a $4k stills camera (the same situation Canon is in). The slumbering giant in the HDSLR game is really Nikon, who don't have a horse in the video market and so can expand their DSLR video capabilities without biting into their own higher-end product range. For whatever reason they've chosen to largely ignore the HDSLR video trend, and at this point may have missed the boat, but who knows - maybe they have something up their sleeve.

Sorry, I didn't mean to make the Alexa sound inadequate - far from it. It's arguably the gold standard for digital cinema production at the moment. I've had one on my shoulder, and it's clearly designed by camera people for camera people; the ergonomics are great, which is more than can be said for some of the competition. In all honesty, you can throw a rock and hit a great digital cinema camera at the moment - there are really great options everywhere. It's kind of a golden age for cinematographers in some ways.

I have the Leone box set on Blu-ray (huge fan); to be honest, unless Criterion is doing it I wouldn't trust a remaster. I mean, 2-perf is pretty grainy - Techniscope was always sort of a compromise - and my copy is already sharp. What exactly is left for them to "improve"? I fear DNR. Read about "The French Connection" remaster to see what a disaster these things can be.

If you ever get the chance, PT Anderson sometimes tours with his personal 70mm print of "The Master". I haven't seen it myself, but if you want the opportunity to see what some would argue is the pinnacle of picture quality, take it. I missed it when it was here and I regret it.
