1440p vs 1080p

Building a new system? Need help choosing between parts? Then step in and let our trained gerbils assist you.

Moderator: JustAnEngineer

1440p vs 1080p

Posted on Mon Dec 02, 2013 2:45 am

I'm having trouble deciding between 1080p and 1440p for my new monitor (I am building an entirely new system, not just a new monitor). Obviously 1440p is better, but I am not going to buy a $500+ video card, so my thinking is that 1080p super smooth with everything at max is a better gaming experience than 1440p at medium quality with the occasional frame drop.

I've more or less decided on a 4GB GTX 770, which I find to be reasonably future-proof but not ridiculously expensive. I don't have a lot of time to play games, but when I do I want a great experience. If it matters, I don't play twitch games/shooters much; I prefer RPGs or hybrids (think Mass Effect, Deus Ex, Skyrim, etc.).

So, any opinions on whether it is worth going for 1440p? Is a GTX 770 powerful enough to drive one?
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 3:17 am

I have a GTX 670 driving a 2560x1440 monitor. It does well, but you're not going to be able to max out demanding games. Many people claim that they need less AA at 2560x1440 because the pixel pitch is finer, but I personally haven't tested this.

For day-to-day activities, it's nice to have 27 inches of screen space.
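
For reference, a quick back-of-the-envelope pixel density comparison (a Python sketch; the 24" 1080p panel is just an assumed typical comparison size):

[code]
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch from resolution and diagonal size
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 2560x1440: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'24" 1920x1080: {ppi(1920, 1080, 24):.0f} PPI')  # ~92 PPI
print(f'27" 1920x1080: {ppi(1920, 1080, 27):.0f} PPI')  # ~82 PPI
[/code]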
kumori
Gerbil Team Leader
Silver subscriber
 
 
Posts: 271
Joined: Sat Dec 17, 2011 11:11 pm

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 3:21 am

Because of memory bandwidth limitations, it seems like the return on investment in a 4GB variant of the 770 is less than you might think. I'm not an expert in such things, though.

The 770 seems to perform acceptably at 1440p. Anandtech threw it at various games on ultra/very high settings at 1440p and it seems to stay above 50 fps for the most part.

But I don't have a 770 (yet, it's on the way) so I don't have any firsthand experience that I can share.
weaktoss
Gerbil
 
Posts: 28
Joined: Fri Jun 07, 2013 12:09 pm

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 3:24 am

The 770 is alright for 4-megapixel gaming with modern titles, but you are going to have to make some compromises in order to keep a smooth framerate (less AA, lower in-game detail, less AF, etc.).

If you want all of the eye candy with AA/AF enabled, then you need to shoot for something more like a 780 or a 290.
Ivy Bridge i5-3570K@4.0Ghz, Gigabyte Z77X-UD3H, 2x4GiB of PC-12800, EVGA 660Ti, Corsair CX-600 and Fractal Define R4 (W). Kentsfield Q6600@3Ghz, HD 4850 2x2GiB PC2-6400, Gigabyte EP45-DS4P, OCZ Modstream 700W, and PC-7B.
Krogoth
Maximum Gerbil
Silver subscriber
 
 
Posts: 4380
Joined: Tue Apr 15, 2003 2:20 pm
Location: somewhere on Core Prime

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 3:42 am

Thanks for the comments!

I hadn't really considered bandwidth limitations for the 4GB variant; that's a good point. Also, I don't demand 60+ FPS all of the time, I'm content with 30+ (I think), so maybe the 770 will be OK for 1440p. But I really don't like that "maybe" :-)

I'll probably buy a 1080p screen. Or maybe just go ahead and throw more money at the problem and buy a more powerful card. I'm buying high-end for the rest of the system, so maybe it is just stupid to skimp on the video card.
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 6:17 am

I am running dual 27" 1440p screens with a GeForce 250.
If you are playing online games (WoW, EverQuest, etc.), the video card will do great.

There are people who seem to have unlimited money and/or eagle-sharp eyes. They will swear you need quad 780s to play Candy Crush.
Since you said that the maxed-out bleeding edge is not your thing, the video card will not be that important.
wkstar
Gerbil In Training
 
Posts: 3
Joined: Wed Jul 09, 2008 10:41 pm

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 6:23 am

ztrand wrote:I'll probably buy a 1080p screen. Or maybe just go ahead and throw more money at the problem and buy a more powerful card. I'm buying high-end for the rest of the system, so maybe it is just stupid to skimp on the video card.
A 770 is hardly "skimping", but I know what you mean. For around the same price as a 770 you can pick up a Radeon 280X, which is less likely to fall victim to bandwidth constraints, and more able to make use of its larger 3GB memory, which may prove relevant (along with Mantle support, perhaps) in the coming year or two.

My personal advice would be to get the 770 or a 280X and a nice 1080p screen like the EIZO Foris FG2421. Gaming at higher resolutions is very nice, but most of your 2560x1440 monitors are somewhat suboptimal for gaming. This is just my opinion, of course. I expect any moment now we'll have Chrispy_ or JustAnEngineer in here to scream BLASPHEMY!

Still, I think you'd profit more from an FG2421 or even an ASUS VG248QE or BenQ XL2420Z than from a 1440p display. 1080p 120Hz vs. 1440p 60Hz is about the same number of pixels in a second, after all!
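
A quick sanity check on that arithmetic (a rough Python sketch, counting raw pixels per second only):

[code]
# raw pixels pushed per second at each resolution/refresh combination
combos = {
    "1080p @ 120 Hz": 1920 * 1080 * 120,
    "1440p @ 60 Hz": 2560 * 1440 * 60,
}
for name, px_per_s in combos.items():
    print(f"{name}: {px_per_s / 1e6:.0f} million pixels/s")
# 1080p @ 120 Hz: 249 million pixels/s
# 1440p @ 60 Hz: 221 million pixels/s  (within roughly 12% of each other)
[/code]
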
wkstar wrote:If you are playing online games (WoW, EverQuest, etc.), the video card will do great.
That's a nice strawman you set up at the end of your post there, but nobody says you need quad 780s to play Candy Crush. You are clearly happy with playing games at their minimum settings and playing truly ancient games -- the rest of us would like some eye candy, please.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 6:30 am

wkstar wrote:I am running dual 27" 1440p screens with a GeForce 250.
If you are playing online games (WoW, EverQuest, etc.), the video card will do great.

There are people who seem to have unlimited money and/or eagle-sharp eyes. They will swear you need quad 780s to play Candy Crush.
Since you said that the maxed-out bleeding edge is not your thing, the video card will not be that important.


Thanks for the reply. Although I didn't really mean that bleeding edge is not my thing, I just meant *twitch* gaming is not my thing. I do play the latest (non-twitch) games and I want them at the highest settings. This may include shooter-type games, just not super-intense multiplayer, etc. Appreciate the WoW info; I have been known to play it periodically.
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 6:37 am

I'd say your real choice should be between TN and IPS. If you have a higher resolution monitor, it can still run at non-native resolutions, and you can always turn down a couple of settings in favor of keeping a whole lot of detail. Since you're probably going to use the monitor longer than the hardware in your computer (assuming you get something that will last many years), having more pixels now will be a better investment, so you're not replacing said monitor the next time you replace your PC.

In the long run, however, viewing angles and color reproduction can get to you, and this is where IPS displays are going to shine. I've used my monitors for everything, and I personally wish I had the IPS display provided for me at work.

Last note, and this is probably the biggest one: check out G-Sync and consider waiting until those monitors hit the market before you buy one. The thing is, G-Sync should handle reconstruction better than anything currently on the market, and frame rates across the board will look better. Obviously getting the maximum FPS allowed by the maximum refresh rate will be best, but tearing will be gone and stuttering should also be much less pronounced.

Personally, I would get a solid 700-series Nvidia card now and pick up a G-Sync-capable monitor in spring/summer 2014. If they end up being awful in terms of price/performance (even though there aren't a lot of monitor benchmarks out there), you can still get a more typical monitor.
Save yourself the trouble and don't ask why the TV remote is in the refrigerator.
superjawes
Gerbil Elite
Gold subscriber
 
 
Posts: 975
Joined: Thu May 28, 2009 8:49 am

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 6:38 am

auxy wrote:A 770 is hardly "skimping", but I know what you mean. For around the same price as a 770 you can pick up a Radeon 280X, which is less likely to fall victim to bandwidth constraints, and more able to make use of its larger 3GB memory, which may prove relevant (along with Mantle support, perhaps) in the coming year or two.

Yeah, a 770 is hardly low-budget territory, but I can get everything pretty high-end for less than $1000. Putting $300 or so into the video card feels OK, but once we get up to $500+ it just feels silly for a single card, even if I can afford it... I used to buy ATI up until the 4870, but I get the impression they have been losing quality in their drivers lately, so I'd prefer Nvidia this time. I also might want to run Linux on it eventually (maybe even SteamOS), so that is one more reason for Nvidia.
auxy wrote:My personal advice would be to get the 770 or a 280X and a nice 1080p screen like the EIZO Foris FG2421. Gaming at higher resolutions is very nice, but most of your 2560x1440 monitors are somewhat suboptimal for gaming. This is just my opinion, of course. I expect any moment now we'll have Chrispy_ or JustAnEngineer in here to scream BLASPHEMY!

Still, I think you'd profit more from an FG2421 or even an ASUS VG248QE or BenQ XL2420Z than from a 1440p display. 1080p 120Hz vs. 1440p 60Hz is about the same number of pixels in a second, after all!

I am leaning towards a good 1080p like you suggested. I think if I get more time for gaming in the future I may want to try some 3-screen setup, which certainly won't be doable with the 770 at 1440p.
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 6:45 am

superjawes wrote:I'd say your real choice should be between TN and IPS. If you have a higher resolution monitor, it can still run at non-native resolutions, and you can always turn down a couple of settings in favor of keeping a whole lot of detail. Since you're probably going to use the monitor longer than the hardware in your computer (assuming you get something that will last many years), having more pixels now will be a better investment, so you're not replacing said monitor the next time you replace your PC.


I'm pretty set on IPS. On my last full-system build I spent almost $1000 on just the screen. I have used it for years and haven't regretted it once. So part of me really wants a good 1440p, but OTOH it feels a bit wasteful considering how little I game with it. I'm also pretty sure now I would need a more expensive video card for it.

And as you say, the G-Sync stuff is also on its way, which further complicates things...
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 6:49 am

superjawes wrote:I'd say your real choice should be between TN and IPS. If you have a higher resolution monitor, it can still run at non-native resolutions, and you can always turn down a couple of settings in favor of keeping a whole lot of detail. Since you're probably going to use the monitor longer than the hardware in your computer (assuming you get something that will last many years), having more pixels now will be a better investment, so you're not replacing said monitor the next time you replace your PC.
VA panels provide 5x the contrast of the best IPS panels.

Just saying.
ztrand wrote:I'm pretty set on IPS. On my last full-system build I spent almost $1000 on just the screen. I have used it for years and haven't regretted it once. So part of me really wants a good 1440p, but OTOH it feels a bit wasteful considering how little I game with it. I'm also pretty sure now I would need a more expensive video card for it.

And as you say, the G-Sync stuff is also on its way, which further complicates things...
G-Sync will not be on an IPS monitor any time soon, nor will truly native 120hz. There are some IPS that can be overclocked to 120hz (and higher), but they look rubbish because the panel's response time can't keep up. If you want IPS, forget about G-Sync, backlight strobing ("LightBoost"), and high refresh rates.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 6:56 am

auxy wrote:G-Sync will not be on an IPS monitor any time soon, nor will truly native 120hz. There are some IPS that can be overclocked to 120hz (and higher), but they look rubbish because the panel's response time can't keep up. If you want IPS, forget about G-Sync, backlight strobing ("LightBoost"), and high refresh rates.

Why no G-Sync on IPS? I am aware there probably won't be any high refresh rate IPS, but I can't see why there would be a problem with G-Sync?
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 7:03 am

ztrand wrote:
auxy wrote:G-Sync will not be on an IPS monitor any time soon, nor will truly native 120hz. There are some IPS that can be overclocked to 120hz (and higher), but they look rubbish because the panel's response time can't keep up. If you want IPS, forget about G-Sync, backlight strobing ("LightBoost"), and high refresh rates.

Why no G-Sync on IPS? I am aware there probably won't be any high refresh rate IPS, but I can't see why there would be a problem with G-Sync?

Well ...

You're right, there's no technical reason someone couldn't make an IPS G-Sync monitor.

However, G-Sync is being marketed as a "gamer" feature and "gamers" (stereotypically speaking -- that is, the ones likely to buy into marketing) have no idea what kind of panel is in their monitor, and IPS panels *are* more expensive, unless you go for one of the cheapie e-IPS, and at that point, why bother?

In other words, I don't think it will happen. Nobody that has announced G-Sync support is known for their IPS monitors save for ASUS, and they've only announced it on their 120Hz displays.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 7:18 am

$332 +18½ shipping or $399 +17¼ shipping for a 2560x1440 IPS LCD monitor isn't going to break your budget.

P.S.: It's down to $300 with free shipping here:
http://slickdeals.net/permadeal/108092/ ... im-monitor
i7-4770K, H70, Gryphon Z87, 16 GiB, R9-290, SSD, 2 HD, Blu-ray, SB ZX, TJ08-E, SS-660XP², 3007WFP+2001FP, RK-9000BR, MX518
JustAnEngineer
Gerbil God
Gold subscriber
 
 
Posts: 15131
Joined: Sat Jan 26, 2002 6:00 pm
Location: The Heart of Dixie

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 7:19 am

auxy wrote:However, G-Sync is being marketed as a "gamer" feature and "gamers" (stereotypically speaking -- that is, the ones likely to buy into marketing) have no idea what kind of panel is in their monitor, and IPS panels *are* more expensive, unless you go for one of the cheapie e-IPS, and at that point, why bother?

In other words, I don't think it will happen. Nobody that has announced G-Sync support is known for their IPS monitors save for ASUS, and they've only announced it on their 120Hz displays.


Ah OK, I see what you mean. To be honest, G-Sync is too far off right now for me anyway, since I am buying this system before Christmas. However, it might be a reason not to go all out on the screen/video card combo. I might go 1080p + 770 now and upgrade in a year or so if I really want G-Sync.

Sigh... this thread has so far just highlighted my problems in choosing. I guess there is no good answer, just "depends"... Thanks for all the replies though :-)
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 7:23 am

JustAnEngineer wrote:$332 +18½ shipping or $399 +17¼ shipping for a 2560x1440 IPS LCD monitor isn't going to break your budget.


Thanks for the links. Unfortunately, my issue isn't really monitor price/model but rather whether going 1440p forces me to buy something better than the GTX 770 :) I think it does, but maybe I should just go ahead and get a 780 Ti or something silly. Unfortunately, I like to consider myself a "non-sucker," and paying 100% more for 25% more performance just feels wrong.
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 7:29 am

I found another $50 savings on the monitor. That ought to leave enough funds available to get to the performance level of a Radeon R9-290 or GeForce GTX780.
http://techreport.com/review/25611/nvid ... eviewed/12
i7-4770K, H70, Gryphon Z87, 16 GiB, R9-290, SSD, 2 HD, Blu-ray, SB ZX, TJ08-E, SS-660XP², 3007WFP+2001FP, RK-9000BR, MX518
JustAnEngineer
Gerbil God
Gold subscriber
 
 
Posts: 15131
Joined: Sat Jan 26, 2002 6:00 pm
Location: The Heart of Dixie

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 7:34 am

JustAnEngineer wrote:I found another $50 savings on the monitor. That ought to leave enough funds available to get to the performance level of a Radeon R9-290 or GeForce GTX780.
http://techreport.com/review/25611/nvid ... eviewed/12

Hm. That really is a good deal. Seems screens are slightly cheaper than I'd expected. I suppose I could squeeze in a 780. More to think about, thanks again!
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 7:49 am

JustAnEngineer wrote:I found another $50 savings on the monitor. That ought to leave enough funds available to get to the performance level of a Radeon R9-290 or GeForce GTX780.
http://techreport.com/review/25611/nvid ... eviewed/12

That's a really nice price on that monitor! I wouldn't want to game on it tho.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 8:15 am

ztrand wrote:
auxy wrote:G-Sync will not be on an IPS monitor any time soon, nor will truly native 120hz. There are some IPS that can be overclocked to 120hz (and higher), but they look rubbish because the panel's response time can't keep up. If you want IPS, forget about G-Sync, backlight strobing ("LightBoost"), and high refresh rates.

Why no G-Sync on IPS? I am aware there probably won't be any high refresh rate IPS, but I can't see why there would be a problem with G-Sync?

That's just initially. G-Sync, or something like it, will eventually make it to every display because it also solves issues where refresh rates don't match the source FPS (movies, for example, are 24 FPS while most LCD TVs are 60 Hz).
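
To make the mismatch concrete, here's a small sketch (Python) of how 24 fps content has to map onto a fixed 60 Hz panel, versus simply holding each frame for 1/24 s on a variable-refresh display:

[code]
fps, hz = 24, 60
# frame i is shown from refresh floor(i*hz/fps) up to floor((i+1)*hz/fps),
# so its hold time (in refreshes) is the difference between the two:
holds = [(i + 1) * hz // fps - i * hz // fps for i in range(fps)]
print(holds)       # [2, 3, 2, 3, ...] -> alternating hold times = judder
print(sum(holds))  # 60 refreshes cover exactly 24 source frames

# with a variable refresh rate there is no mapping problem:
# every frame is simply held on screen for 1/24 s.
[/code]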

And right now, G-Sync relies on a proprietary FPGA from Nvidia, which carries a price premium. That means that putting it on an IPS display would raise the price even more and further limit who could and would adopt it. Once they get things worked out, they can shift to an ASIC inside the monitor, and that will not only reduce cost, but it should also allow AMD GPUs access to the technology, as well as anything else driving a display.

I know that doesn't make anything easier, but it is good perspective.
Save yourself the trouble and don't ask why the TV remote is in the refrigerator.
superjawes
Gerbil Elite
Gold subscriber
 
 
Posts: 975
Joined: Thu May 28, 2009 8:49 am

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 10:51 am

superjawes wrote:That's just initially. G-Sync, or something like it, will eventually make it to every display because it also solves issues where refresh rates don't match the source FPS (movies, for example, are 24 FPS while most LCD TVs are 60 Hz).
This is optimistic, and I hope you're right, but the cynic in me says it will never see adoption and LCDs will be superseded by another display technology before G-Sync (or similar) becomes widespread.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: 1440p vs 1080p

Posted on Mon Dec 02, 2013 11:05 am

auxy wrote:
superjawes wrote:That's just initially. G-Sync, or something like it, will eventually make it to every display because it also solves issues where refresh rates don't match the source FPS (movies, for example, are 24 FPS while most LCD TVs are 60 Hz).
This is optimistic, and I hope you're right, but the cynic in me says it will never see adoption and LCDs will be superseded by another display technology before G-Sync (or similar) becomes widespread.

Worst case is that we *only* get dual refresh rates, so the end result will be 48/60 Hz monitors and TVs that switch depending on whether the media is 24 or 30 FPS (or 48 and 60 FPS, too).

But technically speaking, allowing the device to dictate when the display refreshes implements those multiple refresh rates, so we get that benefit while also improving gaming graphics (which are rendered on the spot).

And once this is in an ASIC format, even if you expect the worst from Nvidia, they can't bork it for AMD with a software update. AMD will just need to learn how to activate it, and then optimize their drivers so that they make full use of G-Sync. They could still do that with the FPGA implementation, but if Nvidia can change it with software, it makes it a moving target to hit.

By the way, having a completely variable refresh rate also means that you don't have to worry about nasty issues like the Xbox One not outputting at 50 Hz...

EDIT: Essentially, if Nvidia tries to lock down G-Sync, my response is "Good luck with that."
Save yourself the trouble and don't ask why the TV remote is in the refrigerator.
superjawes
Gerbil Elite
Gold subscriber
 
 
Posts: 975
Joined: Thu May 28, 2009 8:49 am

Re: 1440p vs 1080p

Posted on Tue Dec 03, 2013 4:55 am

auxy wrote:This is optimistic, and I hope you're right, but the cynic in me says it will never see adoption and LCDs will be superseded by another display technology before G-Sync (or similar) becomes widespread.


As I understand it G-Sync solves a problem not limited to just LCDs. All display technologies suffer from not having perfect sync between graphics buffer update and screen refresh. So even if LCDs are superseded by something better, the idea behind G-Sync still applies so it should be useful.
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Tue Dec 03, 2013 9:43 am

ztrand wrote:
auxy wrote:This is optimistic, and I hope you're right, but the cynic in me says it will never see adoption and LCDs will be superseded by another display technology before G-Sync (or similar) becomes widespread.


As I understand it G-Sync solves a problem not limited to just LCDs. All display technologies suffer from not having perfect sync between graphics buffer update and screen refresh. So even if LCDs are superseded by something better, the idea behind G-Sync still applies so it should be useful.

Not *all* display technologies. Specifically, OLEDs, which have no "refresh rate", per se.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: 1440p vs 1080p

Posted on Tue Dec 03, 2013 10:52 am

I am not 100% sure, but I think OLEDs have a hard-set refresh rate just like LCDs.
ztrand
Gerbil
 
Posts: 97
Joined: Sat Jun 24, 2006 6:54 am
Location: Europe

Re: 1440p vs 1080p

Posted on Tue Dec 03, 2013 11:36 am

Actually, neither LCDs nor OLEDs have a "refresh rate." They have reaction times that can result in "refresh rates," but they don't need to update at fixed intervals. You see, having discrete connections between your source (GPU, DVD player, etc.) and individual pixels is impractical and would require millions of circuits. So instead, the display scans, only changing a few pixels at a time. This allows cables to carry all of that information with relatively few wires. Some protocols are better than others because they allow for higher data rates. An example of that is 4K displays, which need either a pair of DVI cables or a single DisplayPort cable.

This is partially a carryover from the CRT days, but you see it all the time in electrical engineering. Now, what makes G-Sync so great is that it uses the fact that, as already established, OLED and LCD displays don't need to refresh at fixed rates, which allows the display to match the source (like 24 fps for movies). It basically allows the "display data rate" to match whatever rate is coming out of the GPU, and the GPU doesn't discard data partway through, which tears the image.

For additional information, this creates two potential bottlenecks for the system. The first is the maximum "display data rate," which is the resolution at the maximum refresh rate (like 1080p @ 60 Hz). So raising the refresh rate to 120 Hz can improve results by allowing more information at the end of the process, but the system will also be limited by the protocol. If you can only get 1080p @ 60 FPS over HDMI, then you can't expect anything better than that at the display. And yes, you can improve that with better protocols (or more cables :wink: ).
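
To put rough numbers on that "display data rate" (a Python sketch counting raw 24-bit pixel data only, ignoring blanking and protocol overhead; the link rates in the comments are approximate):

[code]
def pixel_rate_gbps(width, height, hz, bits_per_pixel=24):
    # raw pixel data rate in Gbit/s, ignoring blanking and protocol overhead
    return width * height * hz * bits_per_pixel / 1e9

modes = {
    "1080p @ 60 Hz": (1920, 1080, 60),
    "1080p @ 120 Hz": (1920, 1080, 120),
    "1440p @ 60 Hz": (2560, 1440, 60),
    "4K @ 60 Hz": (3840, 2160, 60),
}
for name, (w, h, hz) in modes.items():
    print(f"{name}: ~{pixel_rate_gbps(w, h, hz):.1f} Gbit/s")
# 1080p60 ~3.0, 1080p120 ~6.0, 1440p60 ~5.3, 4K60 ~11.9 Gbit/s
# vs. approximate effective link rates: dual-link DVI ~8, HDMI 1.4 ~8, DisplayPort 1.2 ~17 Gbit/s
[/code]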
Save yourself the trouble and don't ask why the TV remote is in the refrigerator.
superjawes
Gerbil Elite
Gold subscriber
 
 
Posts: 975
Joined: Thu May 28, 2009 8:49 am

Re: 1440p vs 1080p

Posted on Tue Dec 03, 2013 11:39 am

superjawes wrote:Actually, neither LCDs nor OLEDs have a "refresh rate." (etc)
Thanks for saving me the trouble of explaining! Have some tea. 且⊂(゚∀゚*)
superjawes wrote:If you can only get 1080p @ 60 FPS over HDMI, then you can't expect anything better than that at the display.
And this is why everyone should be using DisplayPort! (she says, with two displays connected via DVI and one via VGA.)

By the way, even OLEDs can still have motion blur. I'm eager to see how G-Sync and strobing interplay.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 779
Joined: Sat Jan 19, 2013 3:25 pm
Location: the armpit of Texas

Re: 1440p vs 1080p

Posted on Tue Dec 03, 2013 11:49 am

auxy wrote:And this is why everyone should be using DisplayPort! (she says, with two displays connected via DVI and one via VGA.)

Unfortunately, I doubt this will happen. Even though DisplayPort has a higher data rate, and even though HDMI Founders get royalties for everything HDMI, television tech dictates what we get in displays :(
Save yourself the trouble and don't ask why the TV remote is in the refrigerator.
superjawes
Gerbil Elite
Gold subscriber
 
 
Posts: 975
Joined: Thu May 28, 2009 8:49 am

