G-Sync Impressions

From the pixels, bits, and shaders to the graphics cards that power them. Discuss the latest from AMD and NVIDIA here.

Moderators: morphine, SecretSquirrel

G-Sync Impressions

Posted on Tue Dec 31, 2013 11:36 pm

Hey guys!

I was one of the 5 lucky people selected to get a free G-Sync kit, and I'd like to share my impressions. I'd like to thank the great people here at TechReport for holding this giveaway, and also the folks at Nvidia for making it all happen.

My thoughts before G-Sync:

I was initially using the StrobeLight 120Hz mod (http://www.blurbusters.com/), which took away almost all motion blur, and my GTX 780 kept me at a solid 120 FPS in almost every game. I was incredibly happy with that setup, but to say it was completely blurless would be a lie. It was good, but not perfect. Then came G-Sync. It was hard to get a good idea of its real effects beforehand; there is no easy way to capture what G-Sync does without actually being able to see it. I'll admit I was skeptical at first, since all I had to go on was what the media had reported. It sounded good, but did it really perform as well as they said? Who knew? We just had to believe them, since we couldn't actually see what G-Sync does.

My thoughts after G-Sync:

This stuff is the real deal. Being an avid FPS player, I quickly fired up my favorite competitive FPS: Counter-Strike: Global Offensive. The game was nothing like I remembered after installing G-Sync. It felt like I was gliding; everything was smooth as butter. The game isn't very demanding, of course, yet with StrobeLight I would still get tearing when turning corners fast. With G-Sync there was absolutely no tearing, and the visual quality was spectacular.

Then I fired up Metro: Last Light. This is the game where you can easily see both how well G-Sync works and its limitations. With the highest settings except SSAA, there was zero tearing and a silky-smooth picture, and I'd stay at 60 FPS with no problem. But once I enabled SSAA and my FPS started dipping into the 30 FPS range, the effects of G-Sync disappeared. It was like running on a normal setup: tons of stutter and tearing. Of course, this is to be expected; the reviewers all mention it.

Installation:

Nvidia says the process should take no more than 30 minutes. It took me around 90 minutes, but I was extra careful at every step and it was my first time opening up the monitor, so I'd say 60 minutes is a better estimate; most people will be doing it for the first time as well. The instructions were pretty clear, although there are some discrepancies: the board in the pictures is an earlier revision (the one Scott, Anand, etc. got) and some of the connector spots are a little different, but in the end it isn't anything too confusing.

Conclusion:

G-Sync is a huge step up; there's no doubting that. The only concern I have is the rumored $200 price for the DIY kit. If you get it pre-installed with the monitor, the kit comes out to $120-$150, depending on the price of the VG248QE at the time. G-Sync requires a powerful video card to take advantage of its benefits. Running the highest settings at 45 FPS while it feels like 144Hz with no tearing or blur is just insane. I'm really interested to see how G-Sync evolves.

Once again, thank you fellas at TR and Happy New Year to everyone!
xonar
Gerbil In Training
 
Posts: 5
Joined: Tue Dec 17, 2013 6:46 pm

Re: G-Sync Impressions

Posted on Wed Jan 01, 2014 12:35 am

My boyfriend, dreamss, was another of the lucky winners, and since I had the pleasure of installing the kit today, I thought I'd chime in as well. First, let me say that I haven't played anything on the monitor yet; he's been at it for a while now, and I'm not about to take his toy away from him. These are my first impressions of the DIY kit and basic setup only.

First up, I loved that the manual was very thorough. I've had our monitor apart before, to see what a kit like this would entail, but with this manual it's really something most people should be capable of doing themselves. Big thumbs up on the manual, especially considering the DIY kit is such a niche market; this could've easily just pointed to a website for instructions. Bonus points, too, for a G-Sync sticker designed to fit on the monitor bezel. I normally remove all of this junk, but I'm sure some people will like to show off. The install itself was very straightforward, and I finished in something like 45 minutes, though I wasn't paying close attention since I was watching TV at the time.

I do like that the kit included blockoff plates for the now-empty display connectors and the smaller power connector; this is something that wasn't pictured in the review-copy teardown. However, maybe it's just me, but I could find no way to actually attach the power blockoff to anything, as it lacked the clips the original had. Another interesting thing I noted was what appears to be an HDMI footprint, just missing the actual connector. You can see it here; it lines up perfectly with the old HDMI port and appears to be the same layout. I'll probably solder an HDMI port onto it at some point if no one else pokes at it first.

The kit itself looks like it has plenty of modding potential: there's a second backlight connector missing its header, with what's likely a matching capacitor nearby. The jumper seen in the preview models seems to be missing, though the PCB is still labeled for it (12V/5V panel selection). The actual LVDS connectors are 180 degrees from the original board, which was unexpected and seems like a silly thing to do considering the massive amount of PCB space where they could've routed things however they liked. The header in the middle of the board could be a JTAG interface; you can bet I'll be poking at that as time goes on as well.

Anyway, this is an album of the various pictures we took while installing it. Not a complete guide by any means, but it should give people a rough idea at least. Some things to note: LightBoost now essentially has its own dedicated hardware button to turn it off and on, as long as you don't have G-Sync going at the same time, which is way nicer than random LightBoost hacks. You do lose everything associated with Asus: there's virtually no OSD, no color controls, no more crosshair overlays, etc.
bakageta
Gerbil In Training
 
Posts: 4
Joined: Wed Jan 01, 2014 12:10 am

G-Sync Unboxing, Installation, and Impressions

Posted on Wed Jan 01, 2014 1:50 am

Hello folks,

I'm starting off by posting just the unboxing pics. I'll install it and try it out later today (New Year's Day). For now, enjoy:

UPDATE: Here's a link to the full album http://s145.photobucket.com/user/winglesshuman/library/G-Sync

The installation was rather straightforward thanks to the highly detailed, step-by-step installation guide. So far I've tried CS:GO, Metro: LL, PlanetSide 2, Battlefield 4, Warframe, and Total War: SHOGUN 2. The results are exactly as expected!
There is no tearing at all, ever, with V-sync off. The gameplay feels smooth at 40 FPS and higher, so you can crank the settings. 40 FPS feels almost like 60 FPS to the eyes while playing, which made me feel all warm and fuzzy inside. In games that run at 100+ FPS, everything is buttery. This tech should be in every Nvidia-branded, 3D-Vision-ready monitor on the market ASAP. I'll be getting a 3D Vision 2 kit in a few weeks, so I'll chime back in then.

Are there any questions you'd like to ask?

[Unboxing images]
Last edited by wingless on Wed Jan 01, 2014 1:06 pm, edited 1 time in total.
Intel Core i7 2600K | 16GB DDR3-2133 | ASUS P8Z77-V Pro | PCP&C 750w | Gigabyte GTX 760 | Fatality X-Fi | Samsung 840 Pro | RAID0 2xWD 750GB Caviar Blacks | TV Wonder HD 650 PCIe
wingless
Gerbil XP
 
Posts: 340
Joined: Sun Jun 10, 2007 6:38 pm
Location: Houston, TX

Re: G-Sync Unboxing, Installation, and Impressions

Posted on Wed Jan 01, 2014 1:53 am

*RESERVED*
Intel Core i7 2600K | 16GB DDR3-2133 | ASUS P8Z77-V Pro | PCP&C 750w | Gigabyte GTX 760 | Fatality X-Fi | Samsung 840 Pro | RAID0 2xWD 750GB Caviar Blacks | TV Wonder HD 650 PCIe
wingless
Gerbil XP
 
Posts: 340
Joined: Sun Jun 10, 2007 6:38 pm
Location: Houston, TX

Re: G-Sync Impressions

Posted on Wed Jan 01, 2014 10:18 am

Thanks for your impressions as well.

Those were the discrepancies I was talking about. The pictures show an HDMI port and a different metal plate for the HDMI/DP board; the one we got only had one screw for the DP port. The second screw would not go in because there was no HDMI port for it to attach to. Also, the manual says to "reuse the 5 screws when reassembling," but there are six screw holes, and two of them would not budge for me, so I only have four screws in. The power-connector backplate also had no "lip" to hook onto the chassis like the original Asus one did, so I taped mine on.

Great pics, hope you get to enjoy the g-sync goodness soon!
xonar
Gerbil In Training
 
Posts: 5
Joined: Tue Dec 17, 2013 6:46 pm

Re: G-Sync Impressions

Posted on Wed Jan 01, 2014 1:41 pm

I don't know why I didn't think about taping the power blockout in place; I guess I must've just been in a hurry to get it together. I'm sure I'll have it apart again at some point in the future, and that's a great idea. I finally had a chance to play with things, as dreamss wore himself out, so some quick thoughts: anything that ran in the 45-75 FPS range looks much, MUCH smoother with G-Sync on. I wouldn't have noticed the variance without an onscreen FPS counter, and that's normally a range where I can really tell the difference. In games where we held 100-140 FPS, though, I didn't see a ton of benefit from G-Sync and found them more enjoyable with LightBoost. LightBoost mode is now called ULMB in the OSD, short for ultra-low motion blur, I believe. Compared to the LightBoost hack we had run before, ULMB looks much less washed out while still eliminating a ton of motion blur. I don't have a high-speed camera to record it and compare against the traditional LightBoost hacks, but I eagerly await the lifting of the media blackout so Blur Busters can say whether I'm just biased because it's a new toy or whether they actually improved it.

I managed to get five of the six screws in, mostly because we somehow fumbled the sixth. The two with the wider heads go near the DisplayPort side, while the four brass (I think?) ones with smaller heads go on the power side. The holes didn't quite line up; I had to pull the board off and tweak the standoffs a couple of times to get everything aligned. The manual says they shouldn't screw all the way in, just partially, and that matched what I ended up with: they just stopped about three-quarters of the way in. The power-button ribbon cable was one of the more tedious aspects of the install; they really should've located that port a bit further out.
bakageta
Gerbil In Training
 
Posts: 4
Joined: Wed Jan 01, 2014 12:10 am

Re: G-Sync Impressions

Posted on Wed Jan 01, 2014 6:29 pm

The alignment of the board inside the metal chassis wasn't perfect for me either; that's why I could only align four screws without applying any pressure. I tried about five times to get the fifth one to go through, but as soon as I put one screw in (it didn't matter which hole was first), the board would lose its alignment and the other holes would be off. But you really only need two screws to hold the thing solidly, so four seems alright to me. That blue ribbon-cable port is a joke... it's literally 4-5 inches inside the metal cage, haha. I also had a rough time getting the power and backlight cables in and out; the connectors could be a bit more finger-friendly.

I agree with you on how G-Sync feels. While it seems roughly the same as LightBoost at 120+ FPS, I did notice much less tearing. In games with fast movements and the rigid corners of buildings, I'd notice tearing with LightBoost if I was paying close attention; with G-Sync I only noticed it maybe 5% as often. The problem with noticing G-Sync's benefits is that we are so accustomed to stuttering and tearing that we've learned to ignore them. I played around with G-Sync enabled and disabled, and there really is a noticeable difference. I'm mostly loving the fact that I can run at the highest settings, float in the 40-60 FPS range, and have it look and feel like true 144Hz motion to me.
xonar
Gerbil In Training
 
Posts: 5
Joined: Tue Dec 17, 2013 6:46 pm

Re: G-Sync Impressions

Posted on Wed Jan 01, 2014 10:39 pm

We've got feedback for Nvidia; where can we leave it?

Suggestions for people getting into G-Sync:

Notes:
    ULMB is the improved LightBoost mode; colors look much better.

    DO NOT install the ToastyX LightBoost hack; it will disable any mention of G-Sync in the Nvidia control panel. To uninstall it, simply run strobelight-setup.exe, check the "Reset this display (uninstall)" checkbox, and then click the "Reset this display" button.

    G-Sync with the "Highest available" preferred refresh rate setting will limit the framerate to your refresh rate (144Hz/144 FPS with this setting), which is why I recommend disabling G-Sync in games that can run at higher framerates. While windowed games won't benefit from G-Sync, they can still benefit from ULMB mode.

My setup: 120Hz with ULMB outside of games, 144Hz with G-Sync forced in all games, and for games that run at more than 144 FPS you can disable G-Sync to get ULMB.
[Nvidia Control Panel screenshots]
dreamsss
Gerbil In Training
 
Posts: 2
Joined: Mon Dec 16, 2013 8:14 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 2:13 am

xonar wrote:This stuff is the real deal. Being an avid FPS player, I quickly fired up my favorite competitive FPS: Counter-Strike: Global Offensive. The game was nothing like I remembered after installing G-Sync. It felt like I was gliding; everything was smooth as butter. The game isn't very demanding, of course, yet with StrobeLight I would still get tearing when turning corners fast. With G-Sync there was absolutely no tearing, and the visual quality was spectacular.

Then I fired up Metro: Last Light. This is the game where you can easily see both how well G-Sync works and its limitations. With the highest settings except SSAA, there was zero tearing and a silky-smooth picture, and I'd stay at 60 FPS with no problem. But once I enabled SSAA and my FPS started dipping into the 30 FPS range, the effects of G-Sync disappeared. It was like running on a normal setup: tons of stutter and tearing. Of course, this is to be expected; the reviewers all mention it.


So, I pried dreamss away from it for a bit and had a chance to poke at things, and I have to agree: G-Sync is very impressive, more so than I would've imagined without seeing it firsthand. 30 FPS does seem to be the magic number where it stops working well, but as long as the game stayed above 40 FPS, I couldn't tell offhand what framerate it was running at; everything from 40 to 90 FPS seemed buttery smooth with zero tearing. This was pretty surprising to me, since with the monitor at 144Hz before, it wasn't until ~75+ FPS that I couldn't tell any difference in framerate, and even then there was still tearing. Prior to G-Sync it was really obvious when we'd dip under 60 FPS, and it wasn't until I pulled up EVGA Precision on the second screen to see the actual FPS that I believed it was as low as it was; things were that smooth.

We're driving it with a single GTX 680 right now, and basically everything we threw at it held 40+ FPS on the highest settings, making for a wonderful experience. We have a second GTX 680 on the way so we can see how it plays with SLI microstutter and higher FPS, but from my first impressions, G-Sync does an amazing job of giving you smooth gameplay without requiring a system that'll do 120+ FPS at all times. I'm incredibly jealous now; I want one on my PC.
bakageta
Gerbil In Training
 
Posts: 4
Joined: Wed Jan 01, 2014 12:10 am

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 8:05 am

I look forward to when this tech becomes open and mainstream enough to make it to a standard display interface.

At the moment I don't think I could go back to a 6-bit, wishy-washy TN panel when I normally game on a 50" 100Hz MVA Samsung panel with inky blacks, no viewing-angle issues, and incredible colours.

I do, however, applaud you guys for being the beta testers, and I hope this tech makes it to the mainstream. It would be a shame if it were stunted by proprietary greediness/stupidity, like what Nvidia did with PhysX and 3D Vision, or making streaming to Shield a GTX-only thing...
<insert large, flashing, epileptic-fit-inducing signature (based on the latest internet-meme) here>
Chrispy_
Gerbil Jedi
Gold subscriber
 
 
Posts: 1885
Joined: Fri Apr 09, 2004 3:49 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 10:09 am

xonar wrote: G-Sync requires a powerful video card to take advantage of its benefits.


After reading your and bakageta's comments, I don't agree. If you set up any video card to run at the highest settings it can while keeping your frame rate above 40 FPS, then anybody will see a benefit. For instance, my GTX 660 lets me run my settings on HIGH in BF4 and maintain 60 to 90 FPS. With G-Sync I could enable most of the ULTRA settings and still stay above 40 FPS, or I could just leave them where they are and get rid of the tearing. Heck, with BF3 I could go full ULTRA and still see very fluid gameplay.
So I guess my questions are answered: I do want a G-Sync monitor. I will just have to wait until the monitors come pre-built from Asus in order to afford it. I wish the kits were going to be around $100; then I would get a monitor now and upgrade later, but for now I will wait and see where the price falls.
Windows8.1 Pro 64 bit, Antec EA650 Power Supply, ASRock Extreme 4 motherboard, I5 3570K processor, 8 gigs of Kingston HyperX 1600 DDR3 ram, Kingston HyperX 3K 240 gig SSD, Asus GTX660, Cooler Master Storm Scout case
Pville_Piper
Gerbil First Class
 
Posts: 111
Joined: Tue Dec 25, 2012 2:36 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 10:27 am

You know, I was also thinking about Mantle and its benefits. But I have to ask: what benefit are higher frame rates if you still have tearing? In an FPS, a genre that really benefits from high frame rates, you would have to enable V-Sync, which then introduces lag. The lag in some games is at times horrible, so much so that my KDR drops dramatically. Enabling V-Sync is pretty much a death wish for me... But everything looks sooooooooooooooooooo much better when I'm dying!
C'mon, G-Sync!
Windows8.1 Pro 64 bit, Antec EA650 Power Supply, ASRock Extreme 4 motherboard, I5 3570K processor, 8 gigs of Kingston HyperX 1600 DDR3 ram, Kingston HyperX 3K 240 gig SSD, Asus GTX660, Cooler Master Storm Scout case
Pville_Piper
Gerbil First Class
 
Posts: 111
Joined: Tue Dec 25, 2012 2:36 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 10:46 am

I am one jealous gerbil. Sounds like you all are having a blast.

To comment on the 30 FPS threshold, I believe that 30 Hz is the lowest possible refresh rate when using G-Sync, so you would start to see old issues pop up down there because the refresh is forced (much like how all refreshes are forced on non-G-Sync monitors). For GPU power, you should only need enough to stay above this threshold to see the smooth, buttery goodness as the smoothness comes from minimizing redundant data at your eyes. VSync displays an old frame when a new one isn't ready, but G-Sync avoids this because the monitor doesn't have to refresh at a fixed interval. I've wondered if there might still be some weirdness with uneven exposure (time your eye sees the frame), but it sounds like that effect is minimal.
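The fixed-refresh-versus-refresh-on-demand point above can be illustrated with a quick back-of-the-envelope model. This is just my own toy sketch, not Nvidia's actual implementation; I'm assuming a 60Hz fixed refresh for the V-Sync case and the VG248QE's 144Hz panel maximum for the G-Sync case, and ignoring the 30Hz floor:

```python
# Toy model of frame pacing: fixed-refresh V-Sync vs. refresh-on-demand.
# (Hypothetical illustration only; intervals are simplified.)

REFRESH_60HZ = 1 / 60    # fixed v-sync refresh interval
MIN_INTERVAL = 1 / 144   # panel's fastest possible refresh (144 Hz)

def vsync_display_times(render_times):
    """With v-sync, a finished frame waits for the next fixed 60 Hz tick."""
    out = []
    for t in render_times:
        next_tick = int(t / REFRESH_60HZ) + 1   # first tick after t
        out.append(next_tick * REFRESH_60HZ)
    return out

def gsync_display_times(render_times):
    """With variable refresh, a frame is shown as soon as it's ready,
    limited only by how fast the panel can physically refresh."""
    out, last = [], 0.0
    for t in render_times:
        shown = max(t, last + MIN_INTERVAL)
        out.append(shown)
        last = shown
    return out

# A steady 45 FPS render cadence: a new frame every 1/45 s.
renders = [i / 45 for i in range(1, 10)]
v = vsync_display_times(renders)
g = gsync_display_times(renders)

# Gaps between consecutive on-screen updates, in seconds:
v_gaps = [round(b - a, 4) for a, b in zip(v, v[1:])]
g_gaps = [round(b - a, 4) for a, b in zip(g, g[1:])]
print("v-sync gaps:", v_gaps)  # a mix of ~16.7 ms and ~33.3 ms gaps
print("g-sync gaps:", g_gaps)  # a steady ~22.2 ms gap
```

At a 45 FPS cadence, the fixed-refresh path alternates between short and doubled on-screen gaps (the judder), while the on-demand path shows one even gap matching the render rate, which fits the reports in this thread that 40-ish FPS feels smooth with G-Sync.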

dreamsss, thanks for the screenshots. I don't think you meant to show it, but it appears that it is perfectly possible to keep G-Sync on when a non-G-Sync monitor is also connected to the video card.

Also, (for everyone concerned) don't worry about the cost of the kits and monitors too much. The mod kit isn't a permanent solution. It includes an FPGA (field programmable gate array), which is basically a reprogrammable (and expensive) circuit. Nvidia should be able to transition into ASICs (application specific integrated circuits) for the technology, which would virtually eliminate the price premium. There will be an early adopter tax, but most new technologies have that.
Damage wrote:Don't try to game the requirements by posting everywhere, guys, or I'll nuke you from space.

-Probably the best Damage quote ever.
superjawes
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1090
Joined: Thu May 28, 2009 9:49 am

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 12:16 pm

Pville_Piper wrote:After reading your and bakageta's comments, I don't agree. If you set up any video card to run at the highest settings it can while keeping your frame rate above 40 FPS, then anybody will see a benefit. For instance, my GTX 660 lets me run my settings on HIGH in BF4 and maintain 60 to 90 FPS. With G-Sync I could enable most of the ULTRA settings and still stay above 40 FPS, or I could just leave them where they are and get rid of the tearing. Heck, with BF3 I could go full ULTRA and still see very fluid gameplay.


I really don't agree that it takes a powerful card to take full advantage. I found us turning settings up far higher than we did without G-Sync, because while 40-50 FPS seemed awful without it, it was perfectly smooth with it. It really does seem to let weaker cards deliver a very smooth experience, as long as you can find settings that'll hold you at 40+. I wouldn't want to run it on something incredibly weak, but even a 650 or 660 should manage at modest settings. We'll be picking up a second 680 to throw at it in the next few days so we can see what happens at higher performance levels, so stay tuned for that.

superjawes wrote:dreamsss, thanks for the screenshots. I don't think you meant to show it, but it appears that it is perfectly possible to keep G-Sync on when a non-G-Sync monitor is also connected to the video card.

Also, (for everyone concerned) don't worry about the cost of the kits and monitors too much. The mod kit isn't a permanent solution. It includes an FPGA (field programmable gate array), which is basically a reprogrammable (and expensive) circuit. Nvidia should be able to transition into ASICs (application specific integrated circuits) for the technology, which would virtually eliminate the price premium. There will be an early adopter tax, but most new technologies have that.


No clue how it'd work with multiple G-Sync monitors (I've heard it doesn't :( ), but yep, no problem keeping a second non-G-Sync screen on while gaming. We could hook up a third monitor for testing, but I expect no issues. As for the cost, it'll certainly come down over time with mass production and the move to ASICs. I'm just glad to have access to the FPGA; I expect it'll be fun to poke at. Hopefully I don't screw anything up TOO badly; I doubt Nvidia would be happy then.
bakageta
Gerbil In Training
 
Posts: 4
Joined: Wed Jan 01, 2014 12:10 am

Re: G-Sync Unboxing, Installation, and Impressions

Posted on Thu Jan 02, 2014 4:08 pm

Tell us more about your perceptions! Be detailed!
Milo Burke
Gerbil Team Leader
Gold subscriber
 
 
Posts: 217
Joined: Thu Nov 07, 2013 11:49 am

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 4:41 pm

Merged G-Sync impressions by wingless into the sticky with the other winners.
"Welcome back my friends to the show that never ends. We're so glad you could attend. Come inside! Come inside!"
Ryu Connor
Global Moderator
Gold subscriber
 
 
Posts: 3547
Joined: Thu Dec 27, 2001 7:00 pm
Location: Marietta, GA

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 5:58 pm

So G-Sync is basically a mod for your monitor? Does this mean you could use it with any monitor, or even with a Radeon card? I haven't been following this too closely (studying hard for CCNP and Adtran cert renewal exams).

Is this something OEMs might start installing themselves, so the end user doesn't have to? Hell, I think I'd prefer this over an Oculus Rift any day.
Hz so good
Gerbil Elite
 
Posts: 635
Joined: Wed Dec 04, 2013 5:08 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 6:45 pm

bakageta wrote:
Pville_Piper wrote:After reading your and bakageta's comments, I don't agree. If you set up any video card to run at the highest settings it can while keeping your frame rate above 40 FPS, then anybody will see a benefit. For instance, my GTX 660 lets me run my settings on HIGH in BF4 and maintain 60 to 90 FPS. With G-Sync I could enable most of the ULTRA settings and still stay above 40 FPS, or I could just leave them where they are and get rid of the tearing. Heck, with BF3 I could go full ULTRA and still see very fluid gameplay.


I really don't agree that it takes a powerful card to take full advantage. I found us turning settings up far higher than we did without G-Sync, because while 40-50 FPS seemed awful without it, it was perfectly smooth with it. It really does seem to let weaker cards deliver a very smooth experience, as long as you can find settings that'll hold you at 40+. I wouldn't want to run it on something incredibly weak, but even a 650 or 660 should manage at modest settings. We'll be picking up a second 680 to throw at it in the next few days so we can see what happens at higher performance levels, so stay tuned for that.


I'm just popping in to point out that the weakest GPU that officially supports G-Sync is the 650 Ti Boost, which is about 90% as fast as a 660. The 650 and the non-Boost 650 Ti don't get to use it.

I think it would do a world of good for weaker cards too, but they aren't supported.


Hz so good wrote:So G-Sync is basically a mod for your monitor? Does this mean you could use it with any monitor, or even with a Radeon card? I haven't been following this too closely (studying hard for CCNP and Adtran cert renewal exams).

Is this something OEMs might start installing themselves, so the end user doesn't have to? Hell, I think I'd prefer this over an Oculus Rift any day.


G-sync can only be added to this particular monitor in this way. Other monitors will be released with G-sync built in. I don't see any real reason it couldn't be used with AMD or Intel graphics, aside from the fact that this is Nvidia we're talking about.
Melvar
Gerbil First Class
 
Posts: 137
Joined: Tue May 14, 2013 11:18 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 7:01 pm

Melvar wrote:G-sync can only be added to this particular monitor in this way. Other monitors will be released with G-sync built in. I don't see any real reason it couldn't be used with AMD or Intel graphics, aside from the fact that this is Nvidia we're talking about.


That's good to know! By the time I finally buy a new monitor, maybe G-Sync will have taken hold, and I'll get to experience its purported buttery smoothness. I hope Nvidia aren't dicks about it, locking out anything but GeForce cards, kinda like I hope AMD really does open Mantle up to anybody.
Hz so good
Gerbil Elite
 
Posts: 635
Joined: Wed Dec 04, 2013 5:08 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 7:12 pm

Hz so good wrote:I hope Nvidia aren't dicks about it, locking out anything but GeForce cards, kinda like I hope AMD really does open Mantle up to anybody.

You need to move to Colorado.
Life is hard; but it's harder if you're stupid. Big Al.
Captain Ned
Global Moderator
Gold subscriber
 
 
Posts: 20311
Joined: Wed Jan 16, 2002 7:00 pm
Location: Vermont, USA

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 7:20 pm

Captain Ned wrote:
Hz so good wrote:I hope Nvidia aren't dicks about it, locking out anything but GeForce cards, kinda like I hope AMD really does open Mantle up to anybody.

You need to move to Colorado.



If I could find good work there, I'd do it in a heartbeat! Dunno how I'll handle the cold, being from the armpit of the Deep South and all. At least I'm an ace snow skier! My grandpa was 10th Mountain in WW2, and those guys basically built all the ski resorts in North America, so he made sure I knew how to ski by age 5.
Hz so good
Gerbil Elite
 
Posts: 635
Joined: Wed Dec 04, 2013 5:08 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 7:22 pm

Hz so good wrote:
Melvar wrote:G-sync can only be added to this particular monitor in this way. Other monitors will be released with G-sync built in. I don't see any real reason it couldn't be used with AMD or Intel graphics, aside from the fact that this is Nvidia we're talking about.


That's good to know! By the time I finally buy a new monitor, maybe G-Sync will have taken hold, and I'll get to experience its purported buttery smoothness. I hope Nvidia aren't dicks about it, locking out anything but GeForce cards, kinda like I hope AMD really does open Mantle up to anybody.

The biggest difference between G-Sync and Mantle is that G-Sync is a hardware implementation. The software side is little more than turning it on and off or adjusting certain settings, but the core functionality (setting the monitor to refresh when the GPU tells it to) is all hardware. That makes it much easier for a third party to use it. The only real reason to keep it Nvidia locked right now is that the hardware isn't actually finished, and Nvidia is still figuring out exactly how to implement the protocols and what the proper monitor ASIC needs to look like. The early adopters in this case are basically beta testers. Once all that testing is complete and the design is finalized, I see no reason why AMD cards couldn't tap into the functionality. Actually, AMD could probably work on drivers to use it now, but if something gets changed in the design, it could muck up the performance.

Personally, I think Mantle is a bit stickier when it comes to vendor locking because it is software. It might be marketed as "close to the silicon," but with significant software components, AMD could always issue a new patch or standard that borks performance for Nvidia and Intel chips. For this reason, I'm much less optimistic about cross platform performance.
Damage wrote:Don't try to game the requirements by posting everywhere, guys, or I'll nuke you from space.

-Probably the best Damage quote ever.
superjawes
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1090
Joined: Thu May 28, 2009 9:49 am

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 7:26 pm

Hz so good wrote:being from the armpit of the deep south and all.

Understood. The USAF forced me to spend 3rd grade in Montgomery, AL back in '71/'72.

As for G-Sync, it'll likely be a long time before I own a card and a monitor so capable. Gerbil herding doesn't exactly call for a whole lot of pixel power, and my gaming these days is restricted to whichever SourceForge project makes the original Doom/Doom II look the best.
Life is hard; but it's harder if you're stupid. Big Al.
Captain Ned
Global Moderator
Gold subscriber
 
 
Posts: 20311
Joined: Wed Jan 16, 2002 7:00 pm
Location: Vermont, USA

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 7:28 pm

superjawes wrote:
Hz so good wrote:
Melvar wrote:G-sync can only be added to this particular monitor in this way. Other monitors will be released with G-sync built in. I don't see any real reason it couldn't be used with AMD or Intel graphics, aside from the fact that this is Nvidia we're talking about.


That's good to know! By the time I finally buy a new monitor, maybe G-Sync will have taken hold, and I'll get to experience its purported buttery smoothness. I hope Nvidia aren't dicks about it, locking out anything but GeForce cards, kinda like I hope AMD really does open Mantle up to anybody.

The biggest difference between G-Sync and Mantle is that G-Sync is a hardware implementation. The software side is little more than turning it on and off or adjusting certain settings, but the core functionality (setting the monitor to refresh when the GPU tells it to) is all hardware. That makes it much easier for a third party to use it. The only real reason to keep it Nvidia locked right now is that the hardware isn't actually finished, and Nvidia is still figuring out exactly how to implement the protocols and what the proper monitor ASIC needs to look like. The early adopters in this case are basically beta testers. Once all that testing is complete and the design is finalized, I see no reason why AMD cards couldn't tap into the functionality. Actually, AMD could probably work on drivers to use it now, but if something gets changed in the design, it could muck up the performance.

Personally, I think Mantle is a bit stickier when it comes to vendor locking because it is software. It might be marketed as "close to the silicon," but with significant software components, AMD could always issue a new patch or standard that borks performance for Nvidia and Intel chips. For this reason, I'm much less optimistic about cross platform performance.


I've had bad experiences with vendor lock-in (lookin' at you Oracle), and it would be a shame if Nvidia let an important innovation go the way of Betamax, just because they wanted to sell more GeForce cards.
Hz so good
Gerbil Elite
 
Posts: 635
Joined: Wed Dec 04, 2013 5:08 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 7:46 pm

Hz so good wrote:I've had bad experiences with vendor lock-in (lookin' at you Oracle), and it would be a shame if Nvidia let an important innovation go the way of Betamax, just because they wanted to sell more GeForce cards.

At worst it will only be a generation or two of cards, and even then, those generations will probably be good for refreshing monitors to include the functionality (more people replace monitors, more people get the benefit). Heck, having a variable refresh rate is good for any visual medium, so we could see it start to pop into television sets, too.
Damage wrote:Don't try to game the requirements by posting everywhere, guys, or I'll nuke you from space.

-Probably the best Damage quote ever.
superjawes
Graphmaster Gerbil
Gold subscriber
 
 
Posts: 1090
Joined: Thu May 28, 2009 9:49 am

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 8:04 pm

Hz so good wrote:it would be a shame if Nvidia let an important innovation go the way of Betamax, just because they wanted to sell more GeForce cards.


Again, this is Nvidia we're talking about. PhysX is basically worthless even for those of us with Nvidia cards because they won't let it run on AMD video cards (even though they have no problem licensing it for the new AMD based consoles), so no games use it for anything that actually affects gameplay.

I really hope Nvidia only has patents for this particular implementation of variable refresh rates, and not a patent on variable refresh rates themselves. They could hold display technology back for everyone for 20 years if they have that. Just like PhysX, even Nvidia users would suffer, because only a small number of monitors would ever support an expensive proprietary feature that doesn't work for well over half the people running PCs.
Melvar
Gerbil First Class
 
Posts: 137
Joined: Tue May 14, 2013 11:18 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 8:07 pm

superjawes wrote:
Hz so good wrote:I've had bad experiences with vendor lock-in (lookin' at you Oracle), and it would be a shame if Nvidia let an important innovation go the way of Betamax, just because they wanted to sell more GeForce cards.

At worst it will only be a generation or two of cards, and even then, those generations will probably be good for refreshing monitors to include the functionality (more people replace monitors, more people get the benefit). Heck, having a variable refresh rate is good for any visual medium, so we could see it start to pop into television sets, too.


That would be nice. I'm all for better experiences for everybody. I've even dabbled in UI design, even though I'm the last person you would hire to program anything.
Hz so good
Gerbil Elite
 
Posts: 635
Joined: Wed Dec 04, 2013 5:08 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 8:18 pm

Melvar wrote:
Hz so good wrote:it would be a shame if Nvidia let an important innovation go the way of Betamax, just because they wanted to sell more GeForce cards.


Again, this is Nvidia we're talking about. PhysX is basically worthless even for those of us with Nvidia cards because they won't let it run on AMD video cards (even though they have no problem licensing it for the new AMD based consoles), so no games use it for anything that actually affects gameplay.

I really hope Nvidia only has patents for this particular implementation of variable refresh rates, and not a patent on variable refresh rates themselves. They could hold display technology back for everyone for 20 years if they have that. Just like PhysX, even Nvidia users would suffer, because only a small number of monitors would ever support an expensive proprietary feature that doesn't work for well over half the people running PCs.



That's true, but then again, none of the open physics engines have set the world on fire, either. Nvidia has PhysX, Intel has Havok. What's to stop somebody from using OpenCL to port an open physics engine so we can all benefit?

/Guess I'm just too much of a dreamer...
Hz so good
Gerbil Elite
 
Posts: 635
Joined: Wed Dec 04, 2013 5:08 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 8:25 pm

Captain Ned wrote:
Hz so good wrote:being from the armpit of the deep south and all.

Understood. The USAF forced me to spend 3rd grade in Montgomery, AL back in '71/'72.

As for G-Sync, it'll likely be a long time before I own a card and a monitor so capable. Gerbil herding doesn't exactly call for a whole lot of pixel power, and my gaming these days is restricted to whichever SourceForge project makes original Doom/II look the best.


HA! I was an Army brat. :D

We ended up in the deep south while I was still little, due to family being there.
Hz so good
Gerbil Elite
 
Posts: 635
Joined: Wed Dec 04, 2013 5:08 pm

Re: G-Sync Impressions

Posted on Thu Jan 02, 2014 9:46 pm

Pville_Piper wrote:
xonar wrote: G-Sync requires a powerful video card to take advantage of its benefits.


After reading yours and bakagetas' comments I don't agree. If you set up any video card to run at the highest settings that it can and still keep your frame rate above 40 fps, then anybody will see a benefit. For instance, my GTX660 will allow me to run my settings on HIGH in BF4 and maintain 60 to 90 fps. With G-Synch I could enable most of the ULTRA settings and still stay above 40 fps. Or I could just leave them where they are and get rid of the tearing. Heck, with BF3 I could go full ULTRA and still see very fluid gameplay.
So I guess that my questions are answered, I do want to get a G-Synch monitor. I will just have to wait until the monitors come pre-built from Asus in order to afford it. I wish the kits were going to be around $100; then I would get a monitor now and upgrade later, but for now I will wait and see where the price falls.


Sorry, I didn't literally mean it requires a good video card. What I meant, more or less, was that G-Sync's biggest benefits come from a good video card that can run very high visual quality at 40 FPS. Yes, a GTX660 can benefit from G-Sync, but not to the same extent as a GTX780 can. Try playing Metro 2033/LL with SSAA enabled and G-Sync on your GTX660 and let me know how that goes... Yes, you can still benefit from G-Sync, but in terms of visual quality plus fluidity, you cannot beat the benefits a high-end video card brings compared to a mid-range card. Note: I am not saying ultra high-end nor top of the line.

Also, G-Sync takes a ~5% FPS hit when enabled. And it's called G-Sync, not G-Synch.
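To put that ~5% hit in perspective, here's some back-of-the-envelope arithmetic (my own illustration, not measured data) showing what it costs at various frame rates:

```python
# Back-of-the-envelope arithmetic (illustrative, not measured data): what a
# ~5% frame-rate overhead from enabling G-Sync means in practice.

def with_overhead(fps, overhead=0.05):
    """Effective frame rate after a fractional overhead is applied."""
    return fps * (1 - overhead)

def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (40, 60, 90, 120):
    eff = with_overhead(fps)
    print(f"{fps:>3} fps -> {eff:.1f} fps with G-Sync "
          f"({frame_time_ms(eff) - frame_time_ms(fps):.2f} ms added per frame)")
```

At 60 fps the hit is under a millisecond per frame, which is why the smoothness gain tends to outweigh the raw frame-rate cost as long as you stay above the ~40 fps range.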
xonar
Gerbil In Training
 
Posts: 5
Joined: Tue Dec 17, 2013 6:46 pm
