G-Sync Monitors?

What you see is what you get, including photography, displays, and video equipment.

Moderators: Dposcorp, SpotTheCat

G-Sync Monitors?

Posted on Sat Oct 19, 2013 10:33 am

http://rog.asus.com/267372013/gaming/nv ... -monitors/

What are everyone's thoughts on becoming an early adopter of this technology?
Intel Core i7-875K, Asus P7P55D-E Pro, Win 7 Home Premium
MSI GTX 560 Ti OC, Mushkin 2x2GB DDR3-1333, Corsair TX650
Cooler Master Hyper 212+, Logitech Z-2300, ASUS Xonar DX
Samsung Spinpoint F3 1TB, Dell Ultrasharp U2410, Antec P183
DeadOfKnight
Gerbil Elite
Gold subscriber
 
 
Posts: 634
Joined: Tue May 18, 2010 1:20 pm

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 10:48 am

As I said in comments - I would gladly switch my monitors if it does make a noticeable difference.
My subscription allows you people to exist on this site and makes me a better human being than you'll ever be
JohnC
Gerbil Jedi
Gold subscriber
 
 
Posts: 1861
Joined: Fri Jan 28, 2011 2:08 pm
Location: NY/NJ/FL

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 11:01 am

I'd gladly switch if it doesn't cost me $140 over current ones...$40 maybe.
Meow.
Savyg
Gerbil Elite
Silver subscriber
 
 
Posts: 605
Joined: Thu Aug 26, 2004 6:18 am
Location: Between desert and tundra

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 11:16 am

Ambivalent. I don't think my GPU is even compatible, so that will need to come first.
This doesn't seem like the type of thing you should adopt early anyway; in a few years that chip might become practically standard. After all, it's not like it's a complicated chip.
Diplomacy42
Gerbil
 
Posts: 63
Joined: Sat Sep 01, 2012 2:56 pm

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 11:17 am

If it can't match the image quality of my existing UltraSharp 3007WFP, I'm not changing.
i7-4770K, H70, Gryphon Z87, 16 GiB, R9-290, SSD, 2 HD, Blu-ray, SB ZX, TJ08-E, SS-660XP², 3007WFP+2001FP, RK-9000BR, MX518
JustAnEngineer
Gerbil God
Gold subscriber
 
 
Posts: 15320
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 11:26 am

I am hoping the DIY kits will be extended to other monitors by either NV or a third party. I don't want to have to compromise image quality by using a TN 'gaming monitor', which is inevitably what this will come stock in.
MadManOriginal
Graphmaster Gerbil
 
Posts: 1403
Joined: Wed Jan 30, 2002 7:00 pm
Location: In my head...

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 11:42 am

http://techreport.com/news/25531/nvidia ... 033#769033
Chrispy_ wrote: Here's a great example of why TN sucks.
+1.
i7-4770K, H70, Gryphon Z87, 16 GiB, R9-290, SSD, 2 HD, Blu-ray, SB ZX, TJ08-E, SS-660XP², 3007WFP+2001FP, RK-9000BR, MX518
JustAnEngineer
Gerbil God
Gold subscriber
 
 
Posts: 15320
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 11:55 am

JustAnEngineer wrote:http://techreport.com/news/25531/nvidia-g-sync-matches-lcd-refresh-rate-to-gpu-render-rate?post=769033#769033
Chrispy_ wrote: Here's a great example of why TN sucks.
+1.

I would not suggest using a TN panel for everyday use. I am suggesting that for gaming it might be a great option. Still, no one has said that G-Sync will not be available on IPS monitors.
Last edited by DeadOfKnight on Sat Oct 19, 2013 12:02 pm, edited 2 times in total.
Intel Core i7-875K, Asus P7P55D-E Pro, Win 7 Home Premium
MSI GTX 560 Ti OC, Mushkin 2x2GB DDR3-1333, Corsair TX650
Cooler Master Hyper 212+, Logitech Z-2300, ASUS Xonar DX
Samsung Spinpoint F3 1TB, Dell Ultrasharp U2410, Antec P183
DeadOfKnight
Gerbil Elite
Gold subscriber
 
 
Posts: 634
Joined: Tue May 18, 2010 1:20 pm

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 11:59 am

MadManOriginal said exactly that in the post to which I responded.
i7-4770K, H70, Gryphon Z87, 16 GiB, R9-290, SSD, 2 HD, Blu-ray, SB ZX, TJ08-E, SS-660XP², 3007WFP+2001FP, RK-9000BR, MX518
JustAnEngineer
Gerbil God
Gold subscriber
 
 
Posts: 15320
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 12:03 pm

JustAnEngineer wrote:MadManOriginal said exactly that in the post to which I responded.


Personally I'd like to see a 120Hz IPS screen. I'd also like to see a 23"-24" 1440p monitor. However, there is no denying the value of a variable refresh rate and no tearing. I might get one of these to pair with my IPS panel.

I'm one of those that turns v-sync on because tearing really bugs the crap out of me even more than lag and FPS jumping around, so maybe I'm an odd case. I wanted a second screen anyway and this is worth an extra $100.
Intel Core i7-875K, Asus P7P55D-E Pro, Win 7 Home Premium
MSI GTX 560 Ti OC, Mushkin 2x2GB DDR3-1333, Corsair TX650
Cooler Master Hyper 212+, Logitech Z-2300, ASUS Xonar DX
Samsung Spinpoint F3 1TB, Dell Ultrasharp U2410, Antec P183
DeadOfKnight
Gerbil Elite
Gold subscriber
 
 
Posts: 634
Joined: Tue May 18, 2010 1:20 pm

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 2:04 pm

I would also love a 24" 1440p monitor, G-Sync or not. Though with, preferably.
I do not understand what I do. For what I want to do, I do not do. But what I hate, I do.
derFunkenstein
Gerbil God
Gold subscriber
 
 
Posts: 21252
Joined: Fri Feb 21, 2003 9:13 pm
Location: WHAT?

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 2:20 pm

derFunkenstein wrote:I would also love a 24" 1440p monitor, G-Sync or not. Though with, preferably.

A 24" 120Hz 1440p IPS monitor, with a GPU-agnostic G-Sync. I can dream.

These companies will tell us what we want, though. What do we know? Heh.
Intel Core i7-875K, Asus P7P55D-E Pro, Win 7 Home Premium
MSI GTX 560 Ti OC, Mushkin 2x2GB DDR3-1333, Corsair TX650
Cooler Master Hyper 212+, Logitech Z-2300, ASUS Xonar DX
Samsung Spinpoint F3 1TB, Dell Ultrasharp U2410, Antec P183
DeadOfKnight
Gerbil Elite
Gold subscriber
 
 
Posts: 634
Joined: Tue May 18, 2010 1:20 pm

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 3:18 pm

I would have if I had not just bought a 7950, probably. I was going to do a monitor upgrade in the next year or so, but I think I'll wait now until G-Sync gets more adoption and/or AMD offers an alternative.

At least this way I can upgrade my GPU in two years or so and get a G-Sync capable monitor at the same time.
Being an adult doesn't mean you have to know what you're doing. It just means you have to look like you know what you're doing.
superjawes
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1027
Joined: Thu May 28, 2009 9:49 am

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 4:08 pm

Yes, please. I'm already in the market for new 24"+ IPS panels, so if I can find a monitor that supports this, I'll have to seriously consider it.

I've mentioned my opinion about how GUIs could be better with higher refresh rates and pixel densities in other threads and comments. Letting the video card drive the refresh rate of the monitor is a good idea, and now a blatantly obvious engineering decision.
Flatland_Spider
Gerbil Elite
 
Posts: 808
Joined: Mon Sep 13, 2004 8:33 pm
Location: The 918/539

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 4:57 pm

Diplomacy42 wrote:This doesn't seem like the type of thing you should adopt early anyway, in a few years...

:lol: Of course! If you're planning on living forever - I suggest just waiting for direct neural interface or something, you might save even more moneys that way :wink:

DeadOfKnight wrote:I'm one of those that turns v-sync on because tearing really bugs the crap out of me even more than lag and FPS jumping around, so maybe I'm an odd case

Nah, you're not "odd" - I always force "Adaptive Vsync" in driver's control panel. I haven't seen any problems with that, even with bunch of kiddies shouting "OMG, VSYNC gives lags!" or "OMG, Adaptive Vsync gives tearing!!!!1111" on various FPS forums :wink:
My subscription allows you people to exist on this site and makes me a better human being than you'll ever be
JohnC
Gerbil Jedi
Gold subscriber
 
 
Posts: 1861
Joined: Fri Jan 28, 2011 2:08 pm
Location: NY/NJ/FL

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 5:00 pm

Doesn't triple buffering basically do a similar thing? It just requires more memory on the card. So instead of upping the memory on their cards, now we have to update all our monitors. I also suspect that the Nvidia G-Sync monitors will not play nice with AMD cards. I don't think this tech is compatible with my three 580s right now anyway, so I'd need to update cards as well. Sounds like a pretty costly upgrade for a minor improvement.
I'm waiting for the day when Maxwell-based cards become available with shared memory like the current crop of consoles has. By the time Maxwell cards are out, 4K will be more mainstream and a bit cheaper, so I figured I'd go for a 4K display. I've been saving up for the massive cost of the monitor and the cards that will be required to drive it since I bought my 580s. Plus, by then OLED may be coming on strong, and those have a very high refresh rate.
I just wonder whether, given the added price of the display and cards, it would be better to invest that money in a better graphics card that could pump out higher frame rates.
yammerpickle2
Gerbil
 
Posts: 13
Joined: Thu Apr 01, 2010 12:33 am

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 8:20 pm

JustAnEngineer wrote:http://techreport.com/news/25531/nvidia-g-sync-matches-lcd-refresh-rate-to-gpu-render-rate?post=769033#769033
Chrispy_ wrote: Here's a great example of why TN sucks.
+1.

Yah, my VG248QE actually shows arguably LESS warping of the lagom.nl test pattern than my eIPS VS229H-P side monitors, which have zero vertical gamma shift -- but a noticeable horizontal shift. Curious.

By the way, how does your monitor do on this test?

Here, I took some pictures with my girlfriend's phone. They're pretty bad quality, but I took them from more or less exactly where I sit while using it. Have a look: Picture 1, Picture 2. There's some shifting in gamma, but no color shift, and overall it's really not bad. Again, it's overall similar to the eIPS displays I have, which cost a lot less, but also don't do >60Hz or LightBoost (or DisplayPort!).
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 781
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 9:31 pm

60Hz, in situations where your graphics card can't output frames at a constant 60fps:
With Vsync on, your GPU finishes drawing a frame to the buffer ready for the monitor, and the monitor has to wait until its next refresh (up to 1/60th of a second = 16.6ms). On average, that means you're getting an additional 8.3ms of lag at 60Hz (since the delay will be somewhere between 0 and 16.6ms). In this example, where the GPU can't fill the buffer fast enough for the 16.6ms refresh period, you will have to wait an additional 16.6ms, which means that you're actually averaging 25ms of input lag (16.6+8.3) and frame intervals of up to 33.3ms (1/30th of a second).

Now, it varies from person to person, but 1/30th of a second is too long to fool the brain into perceiving motion. It's still watchable at 30fps (movies are 24fps), but the brain struggles to fill in the gaps between separate images if there's too much motion and the difference between the two images is too great. "Fluidity" for most people is somewhere between 35Hz (29ms) and 50Hz (20ms). "Fluidity" as I'm calling it is NOT the same thing as the maximum refresh rate your eyes/brain can process. For me at least, that number is about three times greater.


120Hz, in situations where your graphics card can't output frames at a constant 60fps:
Your GPU finishes drawing a frame to the buffer ready for the monitor, and the monitor has to wait until its next refresh (up to 1/120th of a second = 8.3ms). The improved refresh rate means you're averaging 4.2ms of lag from the refresh interval alone, and another 8.3ms if the GPU skips a frame because it can't keep up - bringing the average input lag down to 12.5ms (8.3+4.2).

Here is where it gets funky: at exactly 60fps, we're missing every other refresh to deliver frames every 16.6ms, but at less than a constant 60fps a frame takes more than 16.6ms. In fact, the GPU has up to 25ms to draw to the buffer before missing the refresh for a second time, rather than 33.3ms before the second refresh when running the monitor at only 60Hz. The shorter interval reduces the missed-refresh penalty by 8.3ms, bringing the longest frame interval down to 25ms, which is important because that number falls comfortably within the 35-50Hz range that most people consider to be fluid.


G-Sync, in situations where your graphics card can't output frames at a constant 60fps:
Your GPU finishes drawing a frame to the buffer ready for the monitor, and the monitor doesn't have to wait at all. The input lag in the system is entirely the game engine and the rendering time, which will always be present, plus any input lag caused by the display's image processing. Those types of lag are always present, though, and in the context we're talking about - lag caused by vsync and refresh intervals - the value is zero. NO LAG. Using the median 25ms value that can be considered "fluid" by most people, the graphics card can deliver framerates as low as 40fps and still look good, still with ZERO LAG.
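For anyone who wants to poke at the numbers, the arithmetic above reduces to a couple of toy Python functions. This is just a back-of-the-envelope model of the scenarios in this post, not how any actual driver works:

```python
import math

def avg_vsync_lag_ms(refresh_hz, missed_refreshes=1):
    # With vsync, a finished frame waits half a refresh interval on average,
    # plus one full interval for every refresh the GPU misses.
    interval = 1000 / refresh_hz
    return interval / 2 + missed_refreshes * interval

def frame_interval_ms(render_ms, refresh_hz):
    # With vsync, a frame is only displayed on the next refresh boundary
    # after it finishes rendering; with G-Sync it would simply be render_ms.
    interval = 1000 / refresh_hz
    return math.ceil(render_ms / interval) * interval

avg_vsync_lag_ms(60)        # 25.0ms (16.6 + 8.3)
avg_vsync_lag_ms(120)       # 12.5ms (8.3 + 4.2)
frame_interval_ms(20, 60)   # 33.3ms (a 20ms frame slips to the second refresh)
frame_interval_ms(20, 120)  # 25.0ms (the same frame is far cheaper at 120Hz)
```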


G-Sync in a nutshell:
Well, even at <60fps, G-Sync does two things: it reduces the input lag, and it increases the amount of time a GPU can spend rendering a frame before frame intervals exceed the 25ms threshold that we (I, at least) don't like. It brings the fluidity benefits of a 120Hz display to sub-60fps gaming, and it manages to reduce input lag to the same level as playing without vsync. Playing with Vsync turned off but cranking detail levels high is an exercise in pointlessness: why go to all the effort of making it look pretty if the monitor rips the dolled-up frame clean in two the minute there's any motion?

Nvidia have worked it out (which is annoying, because Nvidia like to keep things to themselves rather than drive open standards for the industry): we don't need ludicrous 120, 144 or even 240Hz monitors. What we need is as much detail as possible at a framerate that's fluid. Given the choice of gaming with 'ultra settings' at 50Hz (a constant 50fps) or 'medium settings' at 144Hz, I'd definitely go with the higher detail at the lower framerate. Once a game looks fluid, cranking out the frames faster is wasted effort that could be better spent on making the frames prettier/more detailed.


Off topic a bit:
I have actually measured the rate at which I perceive fluidity, and it's 43Hz for me. I won't bore you with the science of how that was calculated, but it's important to know that it is an average; the brain takes longer to process low-contrast and darker images, and in those instances 30Hz can seem fluid. In candyland-style, brightly contrasting images, 60Hz may not be enough. This is arguably why flicker-free monitors required at least 72Hz. The strobing CRT scan was effectively a very bright, very high-contrast white band moving down the screen - the worst-case scenario of bright and contrasty, requiring higher frequencies to fool the brain. I digress, though; I am very happy with my screen running at 85Hz. For the same reason as a 120Hz screen, it brings the dropped-frame interval from 33.3ms down to just 23.5ms - and that happens to be about 43Hz, just about enough to fool my brain. It's obviously not a constant 60fps, but it's better than the yo-yoing between 60fps and 30fps every few frames.

Oh, and congratulations if you read this far. I think I've worked off my foolish alcohol-fueled insomnia, which was the main point of this post ;)
<insert large, flashing, epileptic-fit-inducing signature (based on the latest internet-meme) here>
Chrispy_
Gerbil Jedi
Gold subscriber
 
 
Posts: 1755
Joined: Fri Apr 09, 2004 3:49 pm

Re: G-Sync Monitors?

Posted on Sat Oct 19, 2013 10:44 pm

Alcohol always makes forum posts better :).

But you're right, though I needed at least 85Hz on CRTs to eliminate 'flashing'. I could readily tell you if a CRT was running at 60Hz, 75Hz, or 85Hz- and I could still see improvements at 100Hz and 120Hz. Good fun.

Thankfully, LCDs present an image that doesn't have to constantly 'refresh'.
Canon 6D||[24-105/4L IS USM|100/2.8L Macro IS USM|70-300/4-5.6 IS USM|40/2.8 STM|50/1.4 USM|85/1.8 USM|Samyang/Bower 14/2.8 Full-Manual Rectilinear Wide-angle|
Canon EOS-M|11-22/4-5.6 IS STM|22/2 STM|EF-M 18-55/3.5-5.6 IS STM|
For sale!|24/2.8 IS USM
|
Airmantharp
Maximum Gerbil
 
Posts: 4962
Joined: Fri Oct 15, 2004 10:41 pm

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 12:25 am

Chrispy_ wrote:This is arguably why flicker-free monitors required at least 72Hz.
Airmantharp wrote:But you're right, though I needed at least 85Hz on CRTs to eliminate 'flashing'. I could readily tell you if a CRT was running at 60Hz, 75Hz, or 85Hz- and I could still see improvements at 100Hz and 120Hz. Good fun.
I couldn't see flashing on a 60Hz CRT. Still can't, in fact; I have one hooked up to my newly-built Linux gateway -- an old IBM Aptiva CRT monitor. I *could* tell you, in a game, approximately what the MaxFPS was set to with the display at a 120Hz refresh, though; we tested that quite thoroughly. I guess my eyes are tuned for high sensitivity to fast motion with no sensitivity to flicker.
Airmantharp wrote:Thankfully, LCDs present an image that doesn't have to constantly 'refresh'.
I'm not all that thankful for it. Strobing does resolve the problem, though.

To be on topic, I can't really get excited about Gsync. Unless I'm totally wrong and the improvement in fluid motion and CLEAR motion is better than that from Lightboost, the exclusivity between Gsync and Lightboost means I'm always going to choose the latter.

It's a really neat concept and all, but I don't play games under 60fps anyway, so it doesn't matter that much to me.

By the way, for those of you hoping this will work or 'get hacked to work' with AMD cards, it's somewhat unlikely --
PC Perspective wrote:This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work. The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920x1080 display and as we saw with 3D Vision, supporting G-Sync will require licensing and hardware changes. In fact, NVIDIA claims that the new logic inside the panel's controller is NVIDIA's own design - so you can obviously expect this to only function with NVIDIA GPUs.
That was mostly folks in the comment thread, not in here, but it bears spreading the word. G-Sync requires custom logic inside the monitor and I wouldn't put it past NVIDIA to check some kind of vendor ID in the GPU.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 781
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 1:06 am

They also stated that on the GPU side it's simply a modification of the DP packets, so assuming it's not somehow artificially locked down, it could be implemented in other GPUs in the future, and possibly hacked (officially or not) into current GPUs not made by Nvidia.

As for the usefulness vs. LightBoost - I guess it depends. Neither is terribly likely to arrive on a monitor I'd actually be interested in (large, high-DPI, color-accurate...), but I think I'd rather have G-Sync, overall. LightBoost does nothing for input lag or tearing; it just attacks perceived panel ghosting, which is less useful to me.
Canon 6D||[24-105/4L IS USM|100/2.8L Macro IS USM|70-300/4-5.6 IS USM|40/2.8 STM|50/1.4 USM|85/1.8 USM|Samyang/Bower 14/2.8 Full-Manual Rectilinear Wide-angle|
Canon EOS-M|11-22/4-5.6 IS STM|22/2 STM|EF-M 18-55/3.5-5.6 IS STM|
For sale!|24/2.8 IS USM
|
Airmantharp
Maximum Gerbil
 
Posts: 4962
Joined: Fri Oct 15, 2004 10:41 pm

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 5:49 am

Strobing backlights are great, but they only fix the sample-and-hold motion blur. Even a "slow" 6ms panel exhibits less visible ghosting than the 1-frame blur of sample-and-hold.

If we were still playing fast platform scrollers like Sonic the Hedgehog all day, this would be essential, but most games actually intentionally blur movement for "realism" because this is how our eyes work, making strobes a nice theoretical but barely relevant feature to modern gaming.
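For the curious, the sample-and-hold blur I'm talking about is easy to estimate: while your eye tracks a moving object, each held frame smears across your retina for as long as it stays visible. A quick illustrative sketch, with made-up numbers:

```python
def tracked_blur_px(speed_px_per_s, visible_time_ms):
    # Perceived smear on an eye-tracked object is roughly the distance the
    # object moves across the screen during the time the frame is visible.
    return speed_px_per_s * visible_time_ms / 1000

tracked_blur_px(1000, 1000 / 60)   # ~16.7px smear: 60Hz sample-and-hold
tracked_blur_px(1000, 1000 / 120)  # ~8.3px: 120Hz halves it
tracked_blur_px(1000, 1.5)         # 1.5px: a short strobed backlight flash
```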
<insert large, flashing, epileptic-fit-inducing signature (based on the latest internet-meme) here>
Chrispy_
Gerbil Jedi
Gold subscriber
 
 
Posts: 1755
Joined: Fri Apr 09, 2004 3:49 pm

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 5:51 am

auxy wrote:To be on topic, I can't really get excited about Gsync. Unless I'm totally wrong and the improvement in fluid motion and CLEAR motion is better than that from Lightboost, the exclusivity between Gsync and Lightboost means I'm always going to choose the latter.

I was watching the LinusTechTips Twitch livestream of this, and John Carmack was talking about using both of them together. Then he got a "look" from the Nvidia guys and admitted that this was not yet a supported feature, but he seemed excited at the prospect. Basically, it sounds to me like this technology still has room to mature, so being an early adopter may or may not be the best idea. Still very, very exciting stuff though, in my opinion.
Intel Core i7-875K, Asus P7P55D-E Pro, Win 7 Home Premium
MSI GTX 560 Ti OC, Mushkin 2x2GB DDR3-1333, Corsair TX650
Cooler Master Hyper 212+, Logitech Z-2300, ASUS Xonar DX
Samsung Spinpoint F3 1TB, Dell Ultrasharp U2410, Antec P183
DeadOfKnight
Gerbil Elite
Gold subscriber
 
 
Posts: 634
Joined: Tue May 18, 2010 1:20 pm

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 6:57 am

Chrispy_ wrote:Strobing backlights are great, but they only fix the sample-and-hold motion blur. Even a "slow" 6ms panel exhibits less visible ghosting than the 1-frame blur of sample-and-hold.

If we were still playing fast platform scrollers like Sonic the Hedgehog all day, this would be essential, but most games actually intentionally blur movement for "realism" because this is how our eyes work, making strobes a nice theoretical but barely relevant feature to modern gaming.
You are -- I can't -- this is ridiculous. If you said this to my face, I would jump so I could reach and bop your nose with a squeaky mallet. You might as well have just said "Night vision is a cool theoretical but barely relevant feature to modern warfare." No, it's a huge advantage, and so is LightBoost. ヽ(`Д´#)ノ

Games implement motion blur because they refuse to run at 60Hz on consoles, instead preferring to shoot for 25-30fps to make for nicer screenshots. 25-30fps looks awful, so they use motion blur to try and hide the effect. It still looks ridiculous and unrealistic, because it IS unrealistic. Your eyes blur moving images when they aren't focused on them. Artificially blurring an image which you ARE focused on is distracting and counterproductive. All pro gamers turn off motion blur. The image on screen should stay clear and steady as it does in real life, and LightBoost allows this to happen.

It's clear as day -- compare a screen at 120Hz, then the same screen at 120Hz + LightBoost. I've been in the fortunate situation of comparing them with the same video side-by-side, and it's not even a comparison. It's not even worth talking about. Once you see it, everything else looks like stuttery garbage.

Go play a game with Lightboost enabled and then come back and tell me this. I can't believe your gall. I'm so furious right now! (ノ`Д´)ノ彡┻━┻
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 781
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 7:18 am

I am using a 37" Vizio TV, and I do not think it is a TN panel... I have a 47" Vizio TV hooked to another computer, and I do not believe it is a TN panel either.

I am thinking IPS with big pixels, but I could be wrong... the color seems correct, does not shift, and they have good viewing angles.

The main thing, besides having a decent picture: neither of them has any noticeable LAG of any kind. Nor does my plasma.

Hey, I have noticed a huge increase in quality watching my movies and TV shows at 60fps over 24fps... when the wife notices the difference, you know it is doing something good. Thanks to the Smooth Video Project. On another note, turn everything up to the max in SVP and it will bring my i7-2600K to its knees. I think I would have been OK with an overclock, but I just transplanted it into my 7750 rig with the H67 chipset, and that BIOS won't let me do anything! I had an i3-2125 in my 560 Ti SLI rig and it felt fine. Now my 2600K and 2125 are in their correct homes; running a 2600K at stock speeds is just wrong. 4.6GHz at 1.350V is OK until I get a 240mm rad. My 120mm is doing a good job, even caked with a good bit of dust. I know, I'll vacuum the rad... I just wanted to get my 2600K into its correct home.

Ohh, it was time for a paste change anyway; temps have dropped considerably, 5-10C on both machines, with Arctic Silver 4 or 5, I forget which, but I need to get more TIM. I have no idea where and when the tube I used was from, since I also had a tube of white ceramic paste from my FX-53 940-pin days.
2600k HT on@4705mhz 8gb Cas9 1600 mem 2x EVGA GTX770 4gb Classified cards in SLI @1320 mhz core and 2003 mhz mem,mounted in CM HAF922 with a TX-850 PSU 2xHTPC's 2xi3 2120 3.3ghz dual core,1xasus LP HD6570 1xHIS hd7750@1150core1325mem,55"PanyVT30
vargis14
Graphmaster Gerbil
 
Posts: 1102
Joined: Fri Aug 20, 2010 6:03 pm
Location: philly suburbs

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 8:46 am

auxy, I'm planning a trip west, to shoot landscapes of west Texas and to meet a few people along the way. I think I need to stop by and buy you lunch, if only so I can understand your emoticons :).
Canon 6D||[24-105/4L IS USM|100/2.8L Macro IS USM|70-300/4-5.6 IS USM|40/2.8 STM|50/1.4 USM|85/1.8 USM|Samyang/Bower 14/2.8 Full-Manual Rectilinear Wide-angle|
Canon EOS-M|11-22/4-5.6 IS STM|22/2 STM|EF-M 18-55/3.5-5.6 IS STM|
For sale!|24/2.8 IS USM
|
Airmantharp
Maximum Gerbil
 
Posts: 4962
Joined: Fri Oct 15, 2004 10:41 pm

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 10:00 am

auxy wrote:
Chrispy_ wrote:Strobing backlights are great, but they only fix the sample-and-hold motion blur
RAAAAAAaaaaaaaggggggee


Seriously. Show me the bit where (after you had a ragefit) you found and posted a factual link to show that strobing backlights do anything other than fix sample-and-hold blur.

Also your fundamental reasoning behind motion blur is completely wrong; motion blur effects in games do not blur moving objects that are static relative to your focal point. It seems you don't really understand how human vision works.
<insert large, flashing, epileptic-fit-inducing signature (based on the latest internet-meme) here>
Chrispy_
Gerbil Jedi
Gold subscriber
 
 
Posts: 1755
Joined: Fri Apr 09, 2004 3:49 pm

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 1:03 pm

auxy wrote:...it bears spreading the word. G-Sync requires custom logic inside the monitor and I wouldn't put it past NVIDIA to check some kind of vendor ID in the GPU.

Well, a couple of notes on "custom logic":

If it does end up 100% proprietary to Nvidia, someone like AMD could probably license it and access the internal logic to enable G-Sync on an AMD card. That would suck for AMD and consumers, but the important part is not the GPU; it's getting the monitor into a state where the GPU can control the refresh rate - essentially putting the monitor into a "refresh off" state, with a "refresh enable" signal triggering a scan that displays the new frame. As long as a monitor has that, AMD can set up their own logic through drivers and/or architecture for their own solution.

However, I do not think the logic is terribly complicated, and now that the idea is out there, monitor vendors really only need to offer a way to trigger this "refresh off" state that GPUs can tap into. It could still end up with proprietary chips, sure, but those could be manufactured by a third party, opening things up to all graphics solutions (Intel integrated, AMD, and Nvidia). Also keep in mind that this is a hardware logic block, so Nvidia can't bork G-Sync or an alternative with a software update.

My prediction is that Nvidia will only get a short time with this solution as an "exclusive" during the initial rollout, and that it will see limited adoption because people will need to buy new monitors or televisions that actually have G-Sync support. That gives AMD some time to tinker and see if they can exploit the monitor capability, and LCD manufacturers will have time to experiment as well, perhaps coming up with alternative solutions of their own.
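To make that concrete, here's a toy sketch of what such a "refresh off"/"refresh enable" handshake might look like. Every name here is invented for illustration; nobody outside Nvidia knows the real protocol:

```python
class VariableRefreshPanel:
    """Toy model of the idea above: the panel holds its last frame until the
    GPU raises a refresh-enable signal, or until a timeout forces a
    self-refresh. Purely speculative, not Nvidia's actual design."""

    MAX_HOLD_MS = 33.3  # LCD cells drift, so redraw at least ~30 times/sec

    def __init__(self):
        self.displayed = None

    def refresh_enable(self, frame):
        # GPU signals that a new frame is ready: scan it out immediately.
        self.displayed = frame
        return self.displayed

    def tick(self, held_ms):
        # No new frame yet: self-refresh the held image only if we must.
        if held_ms >= self.MAX_HOLD_MS:
            return self.displayed  # re-scan the old frame to keep the panel alive
        return None  # keep holding; no scan needed

panel = VariableRefreshPanel()
panel.refresh_enable("frame 1")  # shown as soon as it's rendered
panel.tick(12.0)                 # GPU still busy: panel just holds
panel.tick(33.3)                 # timeout: panel re-scans "frame 1"
```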
Being an adult doesn't mean you have to know what you're doing. It just means you have to look like you know what you're doing.
superjawes
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1027
Joined: Thu May 28, 2009 9:49 am

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 10:30 pm

Chrispy_ wrote:Seriously. Show me the bit where (after you had a ragefit) you found and posted a factual link to show that strobing backlights do anything other than fix sample-and-hold blur.
Of course that's all they fix. That's the blur. Once you fix that, combined with the 1ms response time of the display, there isn't any blur anymore. It's perfect, smooth motion, like that captured by a high-speed camera.
Chrispy_ wrote:Also your fundamental reasoning behind motion blur is completely wrong; motion blur effects in games do not blur moving objects that are static relative to your focal point. It seems you don't really understand how human vision works.
Objects that are static relative to your focal point DO often get blurred in games by "motion blur" effects, even though this is wrong. See Mirror's Edge, Lost Planet, Saint's Row, Dead Space.

More to the point I was making, though, objects under your crosshair -- objects your character is clearly focusing on -- also get blurred. This is unrealistic. Things in motion only get blurry in your vision when your eyes can no longer track them. As long as your eye can track it -- and your eye, or at least my eyes, can track pretty damn fast -- and you stay focused on it, it should stay clear.

Most of the picture on screen is going to be blurred by your eyes -anyway- as you look at it. Adding artificial motion blur, whether through inadequacies of your display, or through foolishly adding it in during the post-process phase, is just dumb.
i5-3570K @ 4.4 (NH-C14), 4x8GB DDR3-1866, GA-Z68MA-D3H-B2, ASUS GTXTITAN-6GD5, 128GB Vertex 4 / 2x60GB Vertex Plus R2 / 2x2TB Barracuda 7200.14 RAID0 / ANS-9010 (4x4GB), SST-DA1000 (PSU), 2x VS229H-P, 1x VG248QE, 1x MIMO 720F, Corsair Vengeance K90+M95
auxy
Gerbil Elite
 
Posts: 781
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Posted on Sun Oct 20, 2013 11:33 pm

Have Kepler... Will change.
Windows8.1 Pro 64 bit, Antec EA650 Power Supply, ASRock Extreme 4 motherboard, I5 3570K processor, 8 gigs of Kingston HyperX 1600 DDR3 ram, Kingston HyperX 3K 240 gig SSD, Asus GTX660, Cooler Master Storm Scout case
Pville_Piper
Gerbil First Class
 
Posts: 110
Joined: Tue Dec 25, 2012 2:36 pm
