xonar wrote:This stuff is the real deal. Being an avid FPS player, I quickly fired up my favorite competitive FPS, Counter-Strike: Global Offensive. After installing G-Sync, the game was nothing like I remembered; it felt like I was gliding, and everything was smooth as butter. The game isn't very demanding, of course, yet with Strobelight I would still get tearing when turning corners fast. With G-Sync there was absolutely no tearing, and the visual quality was spectacular.
Then I fired up Metro: Last Light. This is the game where you can easily see both how well G-Sync works and where its limits are. With the highest settings enabled (except SSAA), there was zero tearing and a silky smooth picture; I'd stay at 60 fps, no problem. But once I enabled SSAA and my frame rate started dipping into the 30 fps range, the benefits of G-Sync disappeared. It was like running on a normal setup: tons of stutter and tearing. Of course, this is to be expected, since all the reviewers mention it.
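The drop-off xonar describes follows from the panel's supported refresh window. Here is a minimal Python sketch of that logic; the 30-144 Hz window is an assumption based on the reported specs of the modded VG248QE, and the function and thresholds are illustrative, not Nvidia's actual firmware behavior.

```python
# Minimal sketch of variable-refresh behavior across a panel's supported
# window. The 30-144 Hz range is an assumption (reported specs of the
# modded VG248QE); this is illustrative, not Nvidia's firmware logic.

PANEL_MIN_HZ = 30.0    # below this rate, the panel must refresh on its own
PANEL_MAX_HZ = 144.0   # the panel can't refresh faster than this

MAX_FRAME_MS = 1000.0 / PANEL_MIN_HZ   # ~33.3 ms
MIN_FRAME_MS = 1000.0 / PANEL_MAX_HZ   # ~6.9 ms

def refresh_behavior(frame_ms: float) -> str:
    """Describe what the display does with a frame that took frame_ms to render."""
    if frame_ms < MIN_FRAME_MS:
        # GPU outruns the panel: frames wait, much like v-sync at 144 Hz.
        return "capped at 144 Hz (frame held until the panel is ready)"
    if frame_ms <= MAX_FRAME_MS:
        # The sweet spot xonar saw at 60 fps: the panel refreshes the
        # instant the frame is done, so there's no tearing or stutter.
        return f"synced ({frame_ms:.1f} ms frame, shown exactly once)"
    # The Metro-with-SSAA case: the panel can't wait past its minimum
    # rate, so it repeats the old frame and the judder comes back.
    return "below the panel minimum: frame repeated, stutter returns"

for fps in (200, 90, 60, 45, 30, 24):
    print(f"{fps:>3} fps -> {refresh_behavior(1000.0 / fps)}")
```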
xonar wrote:G-Sync requires a powerful video card to take advantage of its benefits.
Pville_Piper wrote:After reading yours and bakageta's comments, I don't agree. If you set up any video card to run at the highest settings it can while still keeping your frame rate above 40 fps, then anybody will see a benefit. For instance, my GTX 660 lets me run BF4 on HIGH settings and maintain 60 to 90 fps. With G-Sync I could enable most of the ULTRA settings and still stay above 40 fps. Or I could just leave them where they are and get rid of the tearing. Heck, with BF3 I could go full ULTRA and still see very fluid gameplay.
superjawes wrote:dreamsss, thanks for the screenshots. I don't think you meant to show it, but it appears it's perfectly possible to keep G-Sync enabled when a non-G-Sync monitor is also connected to the video card.
Also (for everyone concerned), don't worry too much about the cost of the kits and monitors. The mod kit isn't a permanent solution: it includes an FPGA (field-programmable gate array), which is basically a reprogrammable (and expensive) circuit. Nvidia should be able to transition to ASICs (application-specific integrated circuits) for the technology, which would virtually eliminate the price premium. There will be an early-adopter tax, but most new technologies have that.
bakageta wrote:Pville_Piper wrote:After reading yours and bakageta's comments, I don't agree. If you set up any video card to run at the highest settings it can while still keeping your frame rate above 40 fps, then anybody will see a benefit. For instance, my GTX 660 lets me run BF4 on HIGH settings and maintain 60 to 90 fps. With G-Sync I could enable most of the ULTRA settings and still stay above 40 fps. Or I could just leave them where they are and get rid of the tearing. Heck, with BF3 I could go full ULTRA and still see very fluid gameplay.
I really don't agree that it takes a powerful card to take full advantage. We found ourselves turning settings up far higher than we did without G-Sync, because while 40-50 fps seemed awful without it, it was perfectly smooth with it. It really does seem to let weaker cards deliver a very smooth experience, as long as you can find some combination of settings that'll hold you at 40+ fps. I wouldn't want to run it on something incredibly weak, but even a 650 or 660 should manage at modest settings. We'll be picking up a second 680 to throw at it in the next few days to see what happens at higher performance levels, so stay tuned.
Hz so good wrote:So G-Sync is basically a mod for your monitor? Does this mean you could use it with any monitor, or even with a Radeon card? I haven't been following this too closely (studying hard for CCNP and Adtran cert renewal exams).
Is this something OEMs might start installing themselves, so the end user doesn't have to? Hell, I think I'd prefer this over an Oculus Rift any day.
Melvar wrote:G-Sync can only be added to this particular monitor in this way. Other monitors will be released with G-Sync built in. I don't see any real reason it couldn't be used with AMD or Intel graphics, aside from the fact that this is Nvidia we're talking about.
Captain Ned wrote:Hz so good wrote:I hope Nvidia aren't dicks about it, locking out anything but GeForce cards, kinda like I hope AMD really does open Mantle up to anybody.
You need to move to Colorado.
Hz so good wrote:Melvar wrote:G-Sync can only be added to this particular monitor in this way. Other monitors will be released with G-Sync built in. I don't see any real reason it couldn't be used with AMD or Intel graphics, aside from the fact that this is Nvidia we're talking about.
That's good to know! By the time I finally buy a new monitor, maybe G-Sync will have taken hold, and I'll get to experience its purported buttery smoothness. I hope Nvidia aren't dicks about it, locking out anything but GeForce cards, kinda like I hope AMD really does open Mantle up to anybody.
Hz so good wrote:Being from the armpit of the Deep South and all.
superjawes wrote:Hz so good wrote:Melvar wrote:G-Sync can only be added to this particular monitor in this way. Other monitors will be released with G-Sync built in. I don't see any real reason it couldn't be used with AMD or Intel graphics, aside from the fact that this is Nvidia we're talking about.
That's good to know! By the time I finally buy a new monitor, maybe G-Sync will have taken hold, and I'll get to experience its purported buttery smoothness. I hope Nvidia aren't dicks about it, locking out anything but GeForce cards, kinda like I hope AMD really does open Mantle up to anybody.
The biggest difference between G-Sync and Mantle is that G-Sync is a hardware implementation. The software side is little more than turning it on and off or adjusting certain settings; the core functionality (having the monitor refresh when the GPU tells it to) is all hardware. That makes it much easier for a third party to use. The only real reason to keep it Nvidia-locked right now is that the hardware isn't actually finished: Nvidia is still figuring out exactly how to implement the protocols and what the production monitor ASIC needs to look like. The early adopters in this case are basically beta testers. Once all that testing is complete and the design is finalized, I see no reason why AMD cards couldn't tap into the functionality. Actually, AMD could probably work on drivers to use it now, but if something gets changed in the design, it could muck up the performance.
Personally, I think Mantle is a bit stickier when it comes to vendor locking because it is software. It might be marketed as "close to the silicon," but with significant software components, AMD could always issue a new patch or standard that borks performance on Nvidia and Intel chips. For this reason, I'm much less optimistic about cross-platform performance.
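To make superjawes's hardware point concrete, here is a toy Python model contrasting a fixed 60 Hz scanout clock with a GPU-driven refresh. The frame times and the simplified v-sync model are assumptions for illustration only, not Nvidia's actual protocol; it just shows why a frame that misses a fixed tick must wait, while a variable-refresh panel can scan it out immediately.

```python
import math

# Toy model of display timing: fixed 60 Hz scanout (v-sync-like) vs.
# GPU-driven refresh (G-Sync-like). An illustration under simplified
# assumptions, not Nvidia's actual protocol; the v-sync model ignores
# back-pressure on the GPU and just snaps frames to the next tick.

def fixed_refresh(frame_times_ms, refresh_ms=1000.0 / 60):
    """Each finished frame is displayed at the next fixed refresh tick."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        shown.append(math.ceil(t / refresh_ms) * refresh_ms)
    return shown

def gpu_driven_refresh(frame_times_ms):
    """The panel scans out the moment the GPU finishes a frame."""
    shown, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        shown.append(t)
    return shown

frames = [20.0, 22.0, 19.0, 25.0, 21.0]  # ~45-50 fps with uneven pacing
for fixed, vrr in zip(fixed_refresh(frames), gpu_driven_refresh(frames)):
    print(f"fixed tick: {fixed:6.1f} ms   gpu-driven: {vrr:6.1f} ms   "
          f"extra wait: {fixed - vrr:4.1f} ms")
```

In this toy run, the fixed clock delays every frame by a variable amount (up to a full 16.7 ms tick), which is exactly the uneven pacing a GPU-driven refresh removes.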
Hz so good wrote:I've had bad experiences with vendor lock-in (lookin' at you, Oracle), and it would be a shame if Nvidia let an important innovation go the way of Betamax just because they wanted to sell more GeForce cards.
superjawes wrote:Hz so good wrote:I've had bad experiences with vendor lock-in (lookin' at you, Oracle), and it would be a shame if Nvidia let an important innovation go the way of Betamax just because they wanted to sell more GeForce cards.
At worst it will only be a generation or two of cards, and even then, those generations will probably be good for refreshing the installed base of monitors to include the functionality (the more people who replace their monitors, the more people get the benefit). Heck, having a variable refresh rate is good for any visual medium, so we could see it start to pop up in television sets, too.
Melvar wrote:Hz so good wrote:it would be a shame if Nvidia let an important innovation go the way of Betamax just because they wanted to sell more GeForce cards.
Again, this is Nvidia we're talking about. PhysX is basically worthless even for those of us with Nvidia cards, because they won't let it run on AMD video cards (even though they have no problem licensing it for the new AMD-based consoles), so no games use it for anything that actually affects gameplay.
I really hope Nvidia only has patents on this particular implementation of variable refresh rates, and not a patent on variable refresh rates themselves. If they have the latter, they could hold display technology back for everyone for 20 years. Just like with PhysX, even Nvidia users would suffer, because only a small number of monitors would ever support an expensive proprietary feature that doesn't work for well over half the people running PCs.
Captain Ned wrote:Hz so good wrote:Being from the armpit of the Deep South and all.
Understood. The USAF forced me to spend 3rd grade in Montgomery, AL back in '71/'72.
As for G-Sync, it'll likely be a long time before I own a card and a monitor so capable. Gerbil herding doesn't exactly call for a whole lot of pixel power, and my gaming these days is restricted to whichever SourceForge project makes the original Doom/Doom II look best.
Pville_Piper wrote:xonar wrote:G-Sync requires a powerful video card to take advantage of its benefits.
After reading yours and bakageta's comments, I don't agree. If you set up any video card to run at the highest settings it can while still keeping your frame rate above 40 fps, then anybody will see a benefit. For instance, my GTX 660 lets me run BF4 on HIGH settings and maintain 60 to 90 fps. With G-Sync I could enable most of the ULTRA settings and still stay above 40 fps. Or I could just leave them where they are and get rid of the tearing. Heck, with BF3 I could go full ULTRA and still see very fluid gameplay.
So I guess my questions are answered: I do want to get a G-Sync monitor. I'll just have to wait until the monitors come pre-built from Asus in order to afford one. I wish the kits were going to be around $100; then I'd get a monitor now and upgrade later, but for now I'll wait and see where the price falls.