 
DeadOfKnight
Gerbil Elite
Topic Author
Posts: 726
Joined: Tue May 18, 2010 1:20 pm

G-Sync Monitors?

Sat Oct 19, 2013 10:33 am

http://rog.asus.com/267372013/gaming/nv ... -monitors/

What are everyone's thoughts on becoming an early adopter of this technology?
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
JohnC
Gerbil Jedi
Posts: 1924
Joined: Fri Jan 28, 2011 2:08 pm
Location: NY/NJ/FL

Re: G-Sync Monitors?

Sat Oct 19, 2013 10:48 am

As I said in comments - I would gladly switch my monitors if it does make a noticeable difference.
Gifter of Nvidia Titans and countless Twitch donation extraordinaire, nothing makes me more happy in life than randomly helping random people
 
LostCat
Minister of Gerbil Affairs
Posts: 2107
Joined: Thu Aug 26, 2004 6:18 am
Location: Earth

Re: G-Sync Monitors?

Sat Oct 19, 2013 11:01 am

I'd gladly switch if it doesn't cost me $140 over current ones...$40 maybe.
Meow.
 
Diplomacy42
Gerbil
Posts: 63
Joined: Sat Sep 01, 2012 2:56 pm

Re: G-Sync Monitors?

Sat Oct 19, 2013 11:16 am

Ambivalent. I don't think my GPU is even compatible, so that will need to come first.
This doesn't seem like the type of thing you should adopt early anyway; in a few years that chip might be practically standard. After all, it's not like it's a complicated chip.
 
JustAnEngineer
Gerbil God
Posts: 19673
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: G-Sync Monitors?

Sat Oct 19, 2013 11:17 am

If it can't match the image quality of my existing UltraSharp 3007WFP, I'm not changing.
· R7-5800X, Liquid Freezer II 280, RoG Strix X570-E, 64GiB PC4-28800, Suprim Liquid RTX4090, 2TB SX8200Pro +4TB S860 +NAS, Define 7 Compact, Super Flower SF-1000F14TP, S3220DGF +32UD99, FC900R OE, DeathAdder2
 
MadManOriginal
Gerbil Jedi
Posts: 1533
Joined: Wed Jan 30, 2002 7:00 pm
Location: In my head...

Re: G-Sync Monitors?

Sat Oct 19, 2013 11:26 am

I am hoping the DIY kits will be extended to other monitors by either NV or a third party. I don't want to have to compromise image quality by using a TN 'gaming monitor', which is inevitably what this will come out for at first.
 
JustAnEngineer
Gerbil God
Posts: 19673
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: G-Sync Monitors?

Sat Oct 19, 2013 11:42 am

http://techreport.com/news/25531/nvidia ... 033#769033
Chrispy_ wrote:
Here's a great example of why TN sucks.
+1.
· R7-5800X, Liquid Freezer II 280, RoG Strix X570-E, 64GiB PC4-28800, Suprim Liquid RTX4090, 2TB SX8200Pro +4TB S860 +NAS, Define 7 Compact, Super Flower SF-1000F14TP, S3220DGF +32UD99, FC900R OE, DeathAdder2
 
DeadOfKnight
Gerbil Elite
Topic Author
Posts: 726
Joined: Tue May 18, 2010 1:20 pm

Re: G-Sync Monitors?

Sat Oct 19, 2013 11:55 am

JustAnEngineer wrote:
http://techreport.com/news/25531/nvidia-g-sync-matches-lcd-refresh-rate-to-gpu-render-rate?post=769033#769033
Chrispy_ wrote:
Here's a great example of why TN sucks.
+1.

I would not suggest using a TN panel for everyday use. I am suggesting that for gaming it might be a great option. Still, no one has said that G-Sync will not be available on IPS monitors.
Last edited by DeadOfKnight on Sat Oct 19, 2013 12:02 pm, edited 2 times in total.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
JustAnEngineer
Gerbil God
Posts: 19673
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: G-Sync Monitors?

Sat Oct 19, 2013 11:59 am

MadManOriginal said exactly that in the post to which I responded.
· R7-5800X, Liquid Freezer II 280, RoG Strix X570-E, 64GiB PC4-28800, Suprim Liquid RTX4090, 2TB SX8200Pro +4TB S860 +NAS, Define 7 Compact, Super Flower SF-1000F14TP, S3220DGF +32UD99, FC900R OE, DeathAdder2
 
DeadOfKnight
Gerbil Elite
Topic Author
Posts: 726
Joined: Tue May 18, 2010 1:20 pm

Re: G-Sync Monitors?

Sat Oct 19, 2013 12:03 pm

JustAnEngineer wrote:
MadManOriginal said exactly that in the post to which I responded.


Personally I'd like to see a 120Hz IPS screen. I'd also like to see a 23"-24" 1440p monitor. However, there is no denying the value of a variable refresh rate and no tearing. I might get one of these to pair with my IPS panel.

I'm one of those that turns v-sync on because tearing really bugs the crap out of me even more than lag and FPS jumping around, so maybe I'm an odd case. I wanted a second screen anyway and this is worth an extra $100.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
derFunkenstein
Gerbil God
Posts: 25427
Joined: Fri Feb 21, 2003 9:13 pm
Location: Comin' to you directly from the Mothership

Re: G-Sync Monitors?

Sat Oct 19, 2013 2:04 pm

I would also love a 24" 1440p monitor, G-Sync or not. Though with, preferably.
I do not understand what I do. For what I want to do I do not do, but what I hate I do.
Twittering away the day at @TVsBen
 
DeadOfKnight
Gerbil Elite
Topic Author
Posts: 726
Joined: Tue May 18, 2010 1:20 pm

Re: G-Sync Monitors?

Sat Oct 19, 2013 2:20 pm

derFunkenstein wrote:
I would also love a 24" 1440p monitor, G-Sync or not. Though with, preferably.

A 24" 120Hz 1440p IPS monitor, with a GPU-agnostic G-Sync. I can dream.

These companies will tell us what we want, though. What do we know? Heh.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
superjawes
Minister of Gerbil Affairs
Posts: 2475
Joined: Thu May 28, 2009 9:49 am

Re: G-Sync Monitors?

Sat Oct 19, 2013 3:18 pm

I would have if I had not just bought a 7950, probably. I was going to do a monitor upgrade in the next year or so, but I think I'll wait now until G-Sync gets more adoption and/or AMD offers an alternative.

At least this way I can upgrade my GPU in two years or so and get a G-Sync capable monitor at the same time.
On second thought, let's not go to TechReport. It's infested by crypto bull****.
 
Flatland_Spider
Graphmaster Gerbil
Posts: 1324
Joined: Mon Sep 13, 2004 8:33 pm

Re: G-Sync Monitors?

Sat Oct 19, 2013 4:08 pm

Yes, please. I'm already in the market for new 24"+ IPS panels, so if I can find a monitor that supports this, I'll have to seriously consider it.

I've mentioned my opinion about how GUIs could be better with higher refresh rates and pixel densities in other threads and comments. Letting the video card drive the refresh rate of the monitor is a good idea, and now that someone has done it, it seems like a blatantly obvious engineering decision.
 
JohnC
Gerbil Jedi
Posts: 1924
Joined: Fri Jan 28, 2011 2:08 pm
Location: NY/NJ/FL

Re: G-Sync Monitors?

Sat Oct 19, 2013 4:57 pm

Diplomacy42 wrote:
This doesn't seem like the type of thing you should adopt early anyway; in a few years...

:lol: Of course! If you're planning on living forever - I suggest just waiting for direct neural interface or something, you might save even more moneys that way :wink:

DeadOfKnight wrote:
I'm one of those that turns v-sync on because tearing really bugs the crap out of me even more than lag and FPS jumping around, so maybe I'm an odd case

Nah, you're not "odd" - I always force "Adaptive Vsync" in the driver's control panel. I haven't seen any problems with that, even with a bunch of kiddies shouting "OMG, VSYNC gives lags!" or "OMG, Adaptive Vsync gives tearing!!!!1111" on various FPS forums :wink:
Gifter of Nvidia Titans and countless Twitch donation extraordinaire, nothing makes me more happy in life than randomly helping random people
 
yammerpickle2
Gerbil
Posts: 13
Joined: Thu Apr 01, 2010 12:33 am

Re: G-Sync Monitors?

Sat Oct 19, 2013 5:00 pm

Doesn't triple buffering basically do a similar thing? It just requires more memory on the card. So instead of upping the memory on their cards, now we have to update all our monitors. I also suspect that the Nvidia G-Sync monitors will not play nice with AMD cards. I don't think this tech is compatible with my three 580s right now anyway, so I'd need to update cards as well. Sounds like a pretty costly upgrade for a minor improvement.
I'm waiting for the day when Maxwell-based cards become available with shared memory like the current crop of consoles have. By the time Maxwell cards are out, 4K will be more mainstream and a bit cheaper, and I figured I'd go for a 4K display. I've been saving up for the massive cost of the monitor and the cards required to drive it since I bought my 580s. Plus, by then OLED may be coming on strong, and it has a very high refresh rate.
I just wonder whether, given the added price of the display and cards, it would be better to invest that money in a better graphics card that could pump out higher frame rates.
 
auxy
Graphmaster Gerbil
Posts: 1300
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Sat Oct 19, 2013 8:20 pm

JustAnEngineer wrote:
http://techreport.com/news/25531/nvidia-g-sync-matches-lcd-refresh-rate-to-gpu-render-rate?post=769033#769033
Chrispy_ wrote:
Here's a great example of why TN sucks.
+1.

Yah, my VG248QE actually shows arguably LESS warping of the lagom.nl test pattern than my eIPS VS229H-P side monitors, which have zero vertical gamma shift -- but a noticeable horizontal shift. Curious.

By the way, how does your monitor do on this test?

Here, I took some pictures with my girlfriend's phone. They're pretty bad quality, but I took them from more or less exactly where I sit while using it. Have a look: Picture 1, Picture 2. There's some shifting in gamma, but no color shift, and overall it's really not bad. Again, it's overall similar to the eIPS displays I have, which cost a lot less, but also don't do >60Hz or LightBoost (or DisplayPort!)
 
Chrispy_
Maximum Gerbil
Posts: 4670
Joined: Fri Apr 09, 2004 3:49 pm
Location: Europe, most frequently London.

Re: G-Sync Monitors?

Sat Oct 19, 2013 9:31 pm

60Hz, in situations where your graphics card can't output frames at a constant 60fps:
With vsync on, your GPU finishes drawing a frame to the buffer ready for the monitor, and the monitor has to wait until its next refresh (up to 1/60th of a second = 16.6ms). On average, that means you're getting an additional 8.3ms of lag at 60Hz (since the delay will be somewhere between 0 and 16.6ms). In this example, where the GPU can't fill the buffer fast enough for the 16.6ms refresh period, you will have to wait an additional 16.6ms, which means you're actually averaging 25ms of input lag (16.6+8.3) and frame intervals of up to 33.3ms (1/30th of a second).

Now, it varies from person to person, but 1/30th of a second is too long to fool the brain into perceiving motion. It's still watchable at 30fps (movies are 24fps), but the brain struggles to fill in the gaps between separate images if there's too much motion and the difference between the two images is too great. "Fluidity" for most people is somewhere between 35Hz (29ms) and 50Hz (20ms). "Fluidity", as I'm calling it, is NOT the same thing as the maximum refresh rate your eyes/brain can process; for me, at least, that number is about three times greater.


120Hz, in situations where your graphics card can't output frames at a constant 60fps:
Your GPU finishes drawing a frame to the buffer ready for the monitor, and the monitor has to wait until its next refresh (up to 1/120th of a second = 8.3ms). The improved refresh rate means you're averaging 4.2ms of lag from the refresh interval alone, and another 8.3ms if the GPU skips a frame because it can't keep up - bringing the average input lag down to 12.5ms (8.3+4.2).

Here is where it gets funky: at exactly 60fps, we're missing every other refresh to deliver frames every 16.6ms, but at less than a constant 60fps a frame takes more than 16.6ms. In that case the GPU has up to 25ms to draw to the buffer before missing the refresh a second time, rather than the 33.3ms before the second refresh when running the monitor at only 60Hz. The shorter interval between refreshes reduces the missed-refresh penalty by 8.3ms, bringing the longest frame interval down to 25ms, which matters because that number falls comfortably within the 35-50Hz range that most people consider fluid.


G-Sync, in situations where your graphics card can't output frames at a constant 60fps:
Your GPU finishes drawing a frame to the buffer ready for the monitor, and the monitor doesn't have to wait at all. The input lag in the system is then just the game engine and the rendering time, which will always be present, plus any lag the display's own image processing adds. Those sources of lag are always there, though, and in the context we're talking about - lag caused by vsync and refresh intervals - the value is zero. NO LAG. Using the median 25ms value that most people consider "fluid", the graphics card can deliver framerates as low as 40fps and still look good, still with ZERO LAG.
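
If you want to plug other numbers into the arithmetic above, here's a quick Python sketch (a back-of-the-envelope toy model of my own, not how any driver actually schedules frames); render_ms just over 16.6ms stands in for a GPU that narrowly misses 60fps:

Code: Select all
import math

TARGET_FRAME_MS = 1000.0 / 60   # the 16.6ms budget the GPU just misses in the example above

def vsync_stats(refresh_hz, render_ms=TARGET_FRAME_MS + 0.1):
    """Fixed refresh + vsync: the finished frame waits for the next scan-out slot."""
    period_ms = 1000.0 / refresh_hz
    # The frame is shown at the first refresh boundary at or after it finishes rendering.
    frame_interval = math.ceil(render_ms / period_ms) * period_ms
    # Wait for the missed slot, plus half a refresh on average for the random phase offset.
    avg_added_lag = (frame_interval - render_ms) + period_ms / 2.0
    return frame_interval, avg_added_lag

def gsync_stats(render_ms=TARGET_FRAME_MS + 0.1):
    """Variable refresh: the panel scans out as soon as the frame is ready."""
    return render_ms, 0.0

for hz in (60, 85, 120):
    interval, lag = vsync_stats(hz)
    print(f"{hz:>3}Hz + vsync: frame interval {interval:4.1f}ms, added lag ~{lag:4.1f}ms")
interval, lag = gsync_stats()
print(f"G-Sync      : frame interval {interval:4.1f}ms, added lag  {lag:4.1f}ms")

It reproduces the headline figures: roughly 33.3ms intervals and 25ms of added lag at 60Hz, 25ms and 12.5ms at 120Hz, 23.5ms intervals at 85Hz, and zero added lag with a variable refresh.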


G-Sync in a nutshell:
Well, even at <60fps, G-Sync does two things: it reduces the input lag, and it increases the amount of time a GPU can spend rendering a frame before frame intervals exceed the 25ms threshold that we (I, at least) don't like. It brings the fluidity benefits of a 120Hz display to sub-60fps gaming, and it manages to reduce input lag to the same level as playing without vsync. Playing with vsync turned off but cranking detail levels high is an exercise in pointlessness: why go to all the effort of making it look pretty if the monitor rips the dolled-up frame clean in two the minute there's any motion?

Nvidia have worked it out (which is annoying, because Nvidia like to keep things to themselves rather than drive open standards for the industry): we don't need ludicrous 120, 144 or even 240Hz monitors. What we need is as much detail as possible at a framerate that's fluid. Given the choice of gaming with 'ultra settings' at 50Hz (a constant 50fps) or 'medium settings' at 144Hz, I'd definitely go with the higher detail at the lower framerate. Once a game looks fluid, cranking out frames faster is wasted effort that could be better spent on making the frames prettier and more detailed.


Off topic a bit:
I have actually measured the rate at which I perceive fluidity, and it's 43Hz for me. I won't bore you with the science of how that was calculated, but it's important to know that it is an average; the brain takes longer to process low-contrast and darker images, and in those instances 30Hz can seem fluid, while in candyland-style, brightly contrasting images 60Hz may not be enough. This is arguably why flicker-free monitors required at least 72Hz: the strobing CRT scan was effectively a very bright, very high-contrast white band moving down the screen - the worst-case scenario of brightness and contrast, requiring higher frequencies to fool the brain. I digress, though; I am very happy with my screen running at 85Hz. For the same reason as a 120Hz screen, it brings the dropped-frame interval from 33.3ms down to just 23.5ms - and that happens to be about 43Hz, just about enough to fool my brain. It's obviously not a constant 60fps, but it's better than yo-yoing between 60fps and 30fps every few frames.

Oh, and congratulations if you read this far. I think I've worked off my foolish alcohol-fueled insomnia, which was the main point of this post ;)
Congratulations, you've noticed that this year's signature is based on outdated internet memes; CLICK HERE NOW to experience this unforgettable phenomenon. This sentence is just filler and as irrelevant as my signature.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: G-Sync Monitors?

Sat Oct 19, 2013 10:44 pm

Alcohol always makes forum posts better :).

But you're right, though I needed at least 85Hz on CRTs to eliminate 'flashing'. I could readily tell you if a CRT was running at 60Hz, 75Hz, or 85Hz- and I could still see improvements at 100Hz and 120Hz. Good fun.

Thankfully, LCDs present an image that doesn't have to constantly 'refresh'.
 
auxy
Graphmaster Gerbil
Posts: 1300
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Sun Oct 20, 2013 12:25 am

Chrispy_ wrote:
This is arguably why flicker-free monitors required at least 72Hz.
Airmantharp wrote:
But you're right, though I needed at least 85Hz on CRTs to eliminate 'flashing'. I could readily tell you if a CRT was running at 60Hz, 75Hz, or 85Hz- and I could still see improvements at 100Hz and 120Hz. Good fun.
I couldn't see flashing on a 60Hz CRT. Still can't, in fact; I have one hooked up to my newly-built Linux gateway -- an old IBM Aptiva CRT monitor. I *could* tell you, in a game, approximately what the MaxFPS was set to, with the display set to 120Hz refresh, though; we tested that quite thoroughly. I guess my eyes are tuned for high sensitivity to fast motion with no sensitivity to flicker.
Airmantharp wrote:
Thankfully, LCDs present an image that doesn't have to constantly 'refresh'.
I'm not all that thankful for it. Strobing does resolve the problem, though.

To be on topic, I can't really get excited about Gsync. Unless I'm totally wrong and the improvement in fluid motion and CLEAR motion is better than that from Lightboost, the exclusivity between Gsync and Lightboost means I'm always going to choose the latter.

It's a really neat concept and all, but I don't play games under 60fps anyway, so it doesn't matter that much to me.

By the way, for those of you hoping this will work or 'get hacked to work' with AMD cards, it's somewhat unlikely --
PC Perspective wrote:
This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work. The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920x1080 display and as we saw with 3D Vision, supporting G-Sync will require licensing and hardware changes. In fact, NVIDIA claims that the new logic inside the panels controller is NVIDIA's own design - so you can obviously expect this to only function with NVIDIA GPUs.
That was mostly folks in the comment thread, not in here, but it bears spreading the word. G-Sync requires custom logic inside the monitor and I wouldn't put it past NVIDIA to check some kind of vendor ID in the GPU.
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: G-Sync Monitors?

Sun Oct 20, 2013 1:06 am

They also stated that on the GPU side it's simply a modification of the DP packets, so assuming it's not somehow artificially locked down, it could be implemented in other GPUs in the future, and possibly hacked (officially or not) into current GPUs not made by Nvidia.

As for the usefulness vs. Lightboost - I guess it depends. Neither is terribly likely to arrive on a monitor I'd actually be interested in (large, high-DPI, color accurate...), but I think I'd rather have G-Sync, overall. Lightboost does nothing for input lag or tearing; it just attacks perceived panel ghosting, which is less useful to me.
 
Chrispy_
Maximum Gerbil
Posts: 4670
Joined: Fri Apr 09, 2004 3:49 pm
Location: Europe, most frequently London.

Re: G-Sync Monitors?

Sun Oct 20, 2013 5:49 am

Strobing backlights are great, but they only fix the sample-and-hold motion blur. Even a "slow" 6ms panel exhibits less visible ghosting than the 1-frame blur of sample-and-hold.

If we were still playing fast platform scrollers like Sonic the Hedgehog all day, this would be essential, but most games actually intentionally blur movement for "realism" because this is how our eyes work, making strobes a nice theoretical but barely relevant feature to modern gaming.
Congratulations, you've noticed that this year's signature is based on outdated internet memes; CLICK HERE NOW to experience this unforgettable phenomenon. This sentence is just filler and as irrelevant as my signature.
 
DeadOfKnight
Gerbil Elite
Topic Author
Posts: 726
Joined: Tue May 18, 2010 1:20 pm

Re: G-Sync Monitors?

Sun Oct 20, 2013 5:51 am

auxy wrote:
To be on topic, I can't really get excited about Gsync. Unless I'm totally wrong and the improvement in fluid motion and CLEAR motion is better than that from Lightboost, the exclusivity between Gsync and Lightboost means I'm always going to choose the latter.

I was watching the LinusTechTips Twitch livestream of this and John Carmack was talking about using both of them together. Then he got a "look" from the Nvidia guys and admitted that this was not yet a supported feature, but he seemed excited at the prospect of using them both together. Basically, it sounds to me like this technology still has room to mature, so being an early adopter may or may not be the best idea. Still very very exciting stuff though, in my opinion.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
auxy
Graphmaster Gerbil
Posts: 1300
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Sun Oct 20, 2013 6:57 am

Chrispy_ wrote:
Strobing backlights are great, but they only fix the sample-and-hold motion blur. Even a "slow" 6ms panel exhibits less visible ghosting than the 1-frame blur of sample-and-hold.

If we were still playing fast platform scrollers like Sonic the Hedgehog all day, this would be essential, but most games actually intentionally blur movement for "realism" because this is how our eyes work, making strobes a nice theoretical but barely relevant feature to modern gaming.
You are -- I can't -- this is ridiculous. If you said this to my face, I would jump so I could reach and bop your nose with a squeaky mallet. You might as well have just said "Night vision is a cool theoretical but barely relevant feature to modern warfare." No, it's a huge advantage, and so is Lightboost. ヽ(`Д´#)ノ

Games implement motion blur because they refuse to run at 60Hz on consoles, instead preferring to shoot for 25-30fps to make for nicer screenshots. 25-30fps looks awful, so they use motion blur to try to hide the effect. It still looks ridiculous and unrealistic, because it IS unrealistic. Your eyes blur moving images when they aren't focused on them; artificially blurring an image which you ARE focused on is distracting and counterproductive. All pro gamers turn off motion blur. The image on screen should stay clear and steady as it does in real life, and Lightboost allows this to happen.

It's clear as day -- compare a screen at 120Hz, then the same screen at 120Hz + Lightboost. I've had the good fortune to compare them with the same video side by side, and it's not even a comparison. It's not even worth talking about. Once you see it, everything else looks like stuttery garbage.

Go play a game with Lightboost enabled and then come back and tell me this. I can't believe your gall. I'm so furious right now! (ノ`Д´)ノ彡┻━┻
 
vargis14
Gerbil Jedi
Posts: 1900
Joined: Fri Aug 20, 2010 6:03 pm
Location: philly suburbs

Re: G-Sync Monitors?

Sun Oct 20, 2013 7:18 am

I am using a 37" Vizio TV, and I do not think it is a TN panel. I have a 47" Vizio TV hooked to another computer, and I do not believe it is a TN panel either.

I am thinking IPS with big pixels, but I could be wrong... the colors seem correct, don't shift, and both have good viewing angles.

The main thing, besides having a decent picture: neither of them has any noticeable LAG of any kind. Nor does my plasma.

Hey, I have noticed a huge increase in quality watching my movies and TV shows at 60fps over 24fps... when the wife notices the difference, you know it is doing something good. Thanks to the Smooth Video Project. On another note, turn everything up to the max in SVP and it will bring my i7-2600K to its knees. I think I would have been OK with an overclock, but I just transplanted it into my 7750 system's H67-chipset board, and that BIOS won't let me do anything! I had an i3-2125 in my 560 Ti SLI rig and it felt fine. Now my 2600K and 2125 are in their correct homes; running a 2600K at stock speeds is just wrong. 4.6GHz at 1.350V is OK until I get a 240mm rad. My 120mm is doing a good job, even caked with a good bit of dust. I know, I'll vacuum the rad... I just wanted to get my 2600K into its correct home.

Oh, and it was time for a paste change; temps have dropped considerably, 5-10°C on both machines, with Arctic Silver 4 or 5 (I forget), but I need to get more TIM. I have no idea where or when the tube I used came from, since I also had a tube of white ceramic paste from my FX-53 940-pin days.
2600k@4848mhz @1.4v CM Nepton40XL 16gb Ram 2x EVGA GTX770 4gb Classified cards in SLI@1280mhz Stock boost on a GAP67-UD4-B3, SBlaster Z powered by TX-850 PSU pushing a 34" LG 21/9 3440-1440 IPS panel. Pieced together 2.1 sound system
 
Airmantharp
Emperor Gerbilius I
Posts: 6192
Joined: Fri Oct 15, 2004 10:41 pm

Re: G-Sync Monitors?

Sun Oct 20, 2013 8:46 am

auxy, I'm planning a trip west, to shoot landscapes of west Texas and to meet a few people along the way. I think I need to stop by and buy you lunch, if only so I can understand your emoticons :).
 
Chrispy_
Maximum Gerbil
Posts: 4670
Joined: Fri Apr 09, 2004 3:49 pm
Location: Europe, most frequently London.

Re: G-Sync Monitors?

Sun Oct 20, 2013 10:00 am

auxy wrote:
Chrispy_ wrote:
Strobing backlights are great, but they only fix the sample-and-hold motion blur
RAAAAAAaaaaaaaggggggee


Seriously. Show me the bit where (after you had a ragefit) you found and posted a factual link to show that strobing backlights do anything other than fix sample-and-hold blur.

Also your fundamental reasoning behind motion blur is completely wrong; motion blur effects in games do not blur moving objects that are static relative to your focal point. It seems you don't really understand how human vision works.
Congratulations, you've noticed that this year's signature is based on outdated internet memes; CLICK HERE NOW to experience this unforgettable phenomenon. This sentence is just filler and as irrelevant as my signature.
 
superjawes
Minister of Gerbil Affairs
Posts: 2475
Joined: Thu May 28, 2009 9:49 am

Re: G-Sync Monitors?

Sun Oct 20, 2013 1:03 pm

auxy wrote:
...it bears spreading the word. G-Sync requires custom logic inside the monitor and I wouldn't put it past NVIDIA to check some kind of vendor ID in the GPU.

Well, a couple of notes on "custom logic":

If it does end up 100% proprietary to Nvidia, someone like AMD could probably license it and access the internal logic to enable G-Sync on an AMD card. That sucks for AMD and for consumers, but the important part is not the GPU; it's getting the monitor into a state where the GPU can control the refresh rate - essentially putting the monitor into a "refresh off" state, with a "refresh enable" signal triggering a scan and displaying the new frame. As long as a monitor has that, AMD can set up their own logic, through drivers and/or architecture, for their own solution.

However, I do not think that the logic is terribly complicated, and now that the idea is out there, monitor vendors really only need to offer a way to trigger this "refresh off" state that GPUs can tap into. It could still end up with proprietary chips, sure, but those can be manufactured by a third party, opening things up to all graphics solutions (Intel integrated, AMD, and Nvidia). Also keep in mind that this is a hardware logic block, so Nvidia can't bork G-Sync or an alternative with a software update.

My prediction is that Nvidia will only get a short time with this solution as an "exclusive" during the initial rollout, and that it will see limited adoption because people will need to buy new monitors or televisions that actually have G-Sync modes. That gives AMD some time to tinker and see if they can exploit the monitor capability, and LCD manufacturers will have time to experiment as well, perhaps coming up with alternative solutions of their own.
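
To make the "refresh off"/"refresh enable" idea concrete, here's a toy Python sketch of how such a handshake could behave (all names invented for illustration; this is not Nvidia's actual protocol or any real monitor API): the GPU pushes a frame whenever it finishes, and the panel scans it out immediately, only throttling if frames arrive faster than it can physically scan.

Code: Select all
import time

class VariableRefreshMonitor:
    """Hypothetical panel that holds its last image until the GPU asserts 'refresh enable'."""
    def __init__(self, max_refresh_hz=144):
        self.min_interval = 1.0 / max_refresh_hz   # the panel can't scan faster than its max rate
        self._last_scan = 0.0

    def refresh_enable(self, frame):
        """GPU signals a finished frame; scan out as soon as the panel is physically ready."""
        wait = self.min_interval - (time.monotonic() - self._last_scan)
        if wait > 0:
            time.sleep(wait)                       # frame arrived faster than the panel can scan
        self._last_scan = time.monotonic()
        print(f"scanned out frame {frame}")

def render_loop(monitor, frames, render_time):
    """GPU side: render at whatever rate it manages, pushing each frame the moment it's done."""
    for frame in range(frames):
        time.sleep(render_time)                    # stand-in for a variable render time
        monitor.refresh_enable(frame)

render_loop(VariableRefreshMonitor(), frames=5, render_time=0.022)  # ~45fps: no tearing, no waiting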
On second thought, let's not go to TechReport. It's infested by crypto bull****.
 
auxy
Graphmaster Gerbil
Posts: 1300
Joined: Sat Jan 19, 2013 4:25 pm
Location: the armpit of Texas

Re: G-Sync Monitors?

Sun Oct 20, 2013 10:30 pm

Chrispy_ wrote:
Seriously. Show me the bit where (after you had a ragefit) you found and posted a factual link to show that strobing backlights do anything other than fix sample-and-hold blur.
Of course that's all they fix. That's the blur. Once you fix that, combined with the 1ms response time of the display, there isn't any blur anymore. It's perfect, smooth motion, like that captured by a high-speed camera.
Chrispy_ wrote:
Also your fundamental reasoning behind motion blur is completely wrong; motion blur effects in games do not blur moving objects that are static relative to your focal point. It seems you don't really understand how human vision works.
Objects that are static relative to your focal point DO often get blurred in games by "motion blur" effects, even though this is wrong. See Mirror's Edge, Lost Planet, Saint's Row, Dead Space.

More to the point that I was making, though, objects under your crosshair -- objects your character is clearly immediately focusing on -- also get blurred. This is unrealistic. Things in motion only get blurry in your vision when your eyes can no longer track them. As long as your eye can track it -- and your eye, or at least my eyes, can track pretty damn fast -- and you stay focused on it, it should stay clear.

Most of the picture on screen is going to be blurred by your eyes anyway as you look at it. Artificial motion blur, whether from the inadequacies of your display or foolishly added during the post-process phase, is just dumb.
 
Pville_Piper
Gerbil XP
Posts: 347
Joined: Tue Dec 25, 2012 2:36 pm
Location: Pville...

Re: G-Sync Monitors?

Sun Oct 20, 2013 11:33 pm

Have Kepler... Will change.
Windows10, EVGA G2 750w Power Supply, Acer XB270H G-synch monitor, MSI Krait Gaming 3X, I7 6700K, 16 gigs of CORSAIR Vengeance LPX DDR4 3200 MHz ram, Crucial 500 gig SSD, EVGA GTX1080 FTW
