A first look at Nvidia’s G-Sync display tech


We got our first look at G-Sync a couple of months ago at an Nvidia press event in Montreal, and we came away impressed with the handful of demos being shown there. Now, we’ve had the chance to spend some quality time with early G-Sync hardware within the comfy confines of Damage Labs, and we have much more to say about the technology. Read on to see what we think.

So what is G-Sync?

In order to understand G-Sync, you have to understand a little bit about how current display technology works. If you’ve been hanging around TR for any length of time, you probably have a sense of these things. Today’s display tech is based on some fundamental assumptions borrowed from ye olde CRT monitors—as if an electron gun were still scanning rows of phosphors inside of today’s LCDs. Among the most basic of those assumptions is the refresh cycle, where updates are painted on the display at rapid but fixed intervals. Most monitors are refreshed at a rate of 60 times per second, or 60Hz. Going a little deeper, most LCDs still paint the screen much like a CRT: updating rows of pixels from left to right, starting at the top of the screen and scanning down to the bottom.

Updating the screen at fixed intervals can be a fine way to create the illusion of motion. Movies and television do it that way, and so do video games, by and large. However, most motion picture technologies capture images at a fixed rate and then play them back at that same rate, so everything works out nicely. The rich visuals produced by graphics processors in today’s video games are different. Graphics chips produce those images in real time by doing lots of math very quickly, crunching through many billions of floating-point operations each second. Even with all of that power on tap, the computational workloads vary widely as the camera moves through a dynamic, changing game world. Frame rendering times tend to fluctuate as a result. This reality is what has driven our move to frame-time-based performance testing, and we can draw an example from one of our recent GPU reviews to illustrate how frame rendering times vary. Here’s a look at how one of today’s faster graphics cards produces frames in Battlefield 4.

The plot above shows individual rendering times for a handful of frames. There’s really not tons of variance from frame to frame in this example, but rendering times still range from about 16 to 23 milliseconds. Zoom out a bit to look at a longer gameplay sequence, and the range of frame times grows.

The crazy thing is that the stem-winding plot you see above illustrates what we’d consider to be very decent performance. No single frame takes longer than 50 milliseconds to produce, and most of them are rendered much quicker than that. In the world of real-time graphics, that’s a nice looking frame time distribution. As you can imagine, though, matching up this squiggly plot with the regular cadence of a fixed refresh rate would be pretty much impossible.

Here’s what’s crazy: that impossibility sits at the heart of the interaction between GPUs and displays, with every frame that’s produced. GPU rendering times vary, and display refresh rates do not. At its lowest level, the timing of in-game animation is kind of a mess.

For years, we’ve dealt with this problem by choosing between two different coping mechanisms, neither of them particularly good. The usual default method is a technology called vsync, or vertical refresh synchronization. Vsync involves storing completed frames in a buffer and only exposing a fresh, buffered frame when the time comes to paint the screen. This technique can work reasonably well when everything else in the system cooperates—when frames are coming out of the GPU at short, regular intervals.

Frame rendering times tend to vary, though, as we’ve noted. As a result, even with some buffering, the system may not have a frame ready at the start of each new refresh cycle. If there’s no new frame to be displayed when it’s time to paint the screen, the fallback option is to show the preceding frame once again and to wait for the next refresh cycle before flipping to a new one.

This wait for the next refresh interval drops the effective frame rate. The usual refresh interval for a 60Hz display is 16.7 milliseconds. Turn in a frame at every interval, and you’re gaming at a steady 60 FPS. If a frame takes 16.9 milliseconds to render—and is just 0.2 ms late to the party—it will have to wait the remaining 16.5 ms of the current interval before being displayed. The total wait time for a new frame, then, will be 33.3 ms—the equivalent of 30 FPS.

So the consequences for missing a single refresh interval are dire: half the performance and presumably half the perceived smoothness. Things get worse from there. Missing two intervals, with a frame that requires just over 33.3 ms to produce, stretches the display update to 50 ms (equal to 20 FPS). Missing three intervals takes you to 66.7 ms, or 15 FPS. Those are your choices: 60 FPS, 30 FPS, 20 FPS, 15 FPS, and so on.
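To make that arithmetic concrete, here’s a minimal sketch of the quantization math described above. It’s purely my own illustration, not anything from Nvidia or a display vendor: vsync effectively rounds each frame’s render time up to the next whole refresh cycle.

```python
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh cycle on a 60Hz display

def vsync_display_interval(render_ms):
    """Effective time between display updates under vsync: the render
    time rounded up to the next whole refresh cycle."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (16.0, 16.9, 33.5, 50.1):
    shown = vsync_display_interval(render_ms)
    print(f"{render_ms:5.1f} ms render -> {shown:5.1f} ms between updates "
          f"({1000 / shown:4.1f} FPS effective)")
```

Run that, and the 16.9-ms frame from the example above lands in the 33.3-ms bucket (30 FPS), just as described.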

Now imagine what happens in action, as vsync works to map a wavy, up-and-down series of rendered frames to this stair-step series of effective animation rates. Hint: it ain’t exactly ideal. Here are a couple of examples Nvidia has mocked up to illustrate. They’re better than the examples I failed to mock up because I’m lazy.


Source: Nvidia.

This stair-step effect is known as quantization, and it’s the same effect that, in digital audio, can cause problems when mapping an analog waveform to a fixed sampling rate. Heck, I’m pretty sure we’re hearing the effects of intentionally exaggerated quantization in today’s autotune algorithms.

Quantization is not a friend to smooth animation. The second scenario plotted above is fairly common, where frame rendering times range above and below the 16.7-ms threshold. The oscillation between update rates can lead to a halting, uneven sense of motion.

That’s true not just because of the quantized update rates alone, but because of the side effects of delaying frames. When buffered frames waiting in the queue are finally displayed at the next refresh interval, their contents will be temporally out of sync with their display time. After all, as frames are generated, the game engine has no knowledge about when they’ll be displayed. Also, buffering and delaying frames adds latency to the input-response feedback loop, reducing the immediacy of the experience. You’ll wait longer after clicking the mouse or pressing a key before you begin to see the corresponding action taking place onscreen.

Nvidia calls this quantization effect stuttering, and I suppose in a sense it is. However, I don’t think that’s a helpful term to use in this context. Display refresh quantization is a specific and well-understood problem, and its effects are distinct from the longer, more intermittent slowdowns that we usually describe as stuttering.

The downsides of vsync are bad enough that many gamers have opted to disable it instead. Turning off vsync is faster and more immediate, but it means the GPU will flip to a new frame while the display is being drawn. Thus, fragments of multiple rendered frames will occupy portions of the screen simultaneously. The seams between the frames are sometimes easy to see, and they create an artifact called tearing. If you’ve played a 3D game without vsync, you’ve probably seen tearing. Here’s a quick example from Borderlands 2:

Tearing is a huge penalty to pay in terms of visual fidelity. Without any synchronization between GPU render times and frame display times, tearing is likely to be happening somewhere onscreen almost all of the time—perhaps multiple times per refresh cycle, if the GPU is pumping out frames often enough. As with quantization, the type of game and the nature of the motion happening in the game world will influence how readily one perceives a problem.

Like I said, neither of these coping methods is particularly good. G-Sync is intended to be a better solution. G-Sync’s goal is to refresh the display when the GPU has a frame ready, rather than on a fixed schedule. One could say that G-Sync offers a variable refresh rate, but it’s more about refresh times than rates, since it operates on a per-frame basis.

On a fast display with a 144Hz peak refresh rate, G-Sync can vary the refresh interval between 6.9 and 33.3 ms. That first number, 6.9 milliseconds, is the refresh interval at 144Hz. The second is equivalent to 30Hz or 30 FPS. If a new frame isn’t ready after 33.3 ms, G-Sync will paint the screen again with the prior frame. So the refresh interval isn’t infinitely variable, but it does offer pretty wide leeway.
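Here’s the same sort of sketch for G-Sync’s timing as described above—again, just my own simplified model rather than Nvidia’s actual logic. The refresh interval tracks each frame’s render time, clamped by the panel’s 6.9-ms floor and the 33.3-ms repeat-refresh ceiling (the small timing skew a repeat refresh can add is ignored here).

```python
MIN_INTERVAL_MS = 1000 / 144  # ~6.9 ms, the fastest a 144Hz panel can refresh
MAX_INTERVAL_MS = 1000 / 30   # ~33.3 ms, after which the prior frame is repainted

def gsync_refresh_interval(render_ms):
    """Approximate time between display updates for a frame that takes
    render_ms to produce, under the variable-refresh scheme described above."""
    if render_ms > MAX_INTERVAL_MS:
        # The panel repaints the old frame at 33.3 ms; the new frame goes up
        # once it's done rendering (repeat-refresh skew ignored in this sketch).
        return render_ms
    return max(render_ms, MIN_INTERVAL_MS)  # can't refresh faster than 144Hz

for render_ms in (5.0, 10.0, 16.9, 25.0, 40.0):
    print(f"{render_ms:5.1f} ms render -> display updates after "
          f"{gsync_refresh_interval(render_ms):5.1f} ms")
```

Compare that to the vsync sketch earlier: the 16.9-ms frame is shown after 16.9 ms instead of waiting out the full 33.3-ms bucket.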

In theory and in practice, then, G-Sync is easily superior to the alternatives. There’s no tearing, so the visual integrity of displayed frames isn’t compromised, and it provides almost immediate display updates once a frame is ready. Even though GPU frame rendering times vary, G-Sync’s output looks smoother than the quantized output from traditional vsync. That’s true in part because each frame’s contents more closely correspond to its display time. G-Sync also reduces the wait time imposed by the display refresh cycle, cutting input lag.

G-Sync isn’t a perfect solution by any means. It doesn’t eliminate the left-to-right, top-to-bottom motion by which displays are updated, for instance. The 33-ms frame time cap is a little less than ideal, too. Still, this is a far sight better than the antiquated approaches we’ve been using for years.

The hardware

If you’ve managed to wade through these explanations so far, you might be wondering how hard it is to make something like G-Sync work. After all, the GPU acts as the timing source for the display. Couldn’t a variable refresh scheme be implemented in software, perhaps via a GPU driver update?

Turns out it’s not that simple. Doing refreshes at varying intervals creates all sorts of havoc for LCDs, including color/gamma shifting and the like. I had no idea such problems were part of the picture, but happily, Nvidia seems to have worked them out. You’d never know about any potential for trouble when seeing the early G-Sync solutions in action.

Still, making G-Sync work requires much more than a GPU driver update. Nvidia has developed a new control module for LCD monitors. Displays must be equipped with Nvidia’s custom electronics in order to support the new refresh scheme.

The first-gen G-Sync module is pictured on the right. The biggest chip on the module is an FPGA, or field programmable gate array, which can be made to perform a range of custom tasks. In this case, the FPGA is serving as the development and early deployment vehicle for G-Sync. The FPGA is accompanied by a trio of DDR3 memory chips, each 256MB in capacity. I doubt the module requires all 768MB of memory to do its thing, but it likely needs the bandwidth provided by three separate memory channels. Nvidia tells me this first G-Sync module can handle 4K resolutions at refresh rates up to 60Hz.
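As a back-of-envelope check on why bandwidth would matter more than capacity, here’s a quick estimate of the data involved in buffering a 4K stream. The pixel format and the assumption that each frame is written once and read back once are my own guesses, not figures from Nvidia.

```python
WIDTH, HEIGHT = 3840, 2160   # the 4K resolution the module reportedly supports
BYTES_PER_PIXEL = 4          # assuming 8 bits per channel plus padding
REFRESH_HZ = 60

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / 1e6
stream_gb_s = frame_mb * REFRESH_HZ / 1e3

print(f"One buffered 4K frame: ~{frame_mb:.0f} MB")
print(f"Writing and re-reading it at {REFRESH_HZ}Hz: ~{2 * stream_gb_s:.1f} GB/s")
```

A single frame is only a few dozen megabytes, but keeping a steady few gigabytes per second flowing is easier with several memory channels running in parallel.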

We don’t yet have a lot of details on G-Sync monitors and how they’ll be priced. I expect to learn a lot more at CES next week. However, the early info seems to indicate that the G-Sync module will add about $100 to the price of a display. That’s a considerable premium, but I expect Nvidia could cut costs considerably by moving from an FPGA to a custom chip. In fact, since a G-Sync control chip would replace a traditional scaler ASIC, it shouldn’t add much cost at all to the total solution—eventually.

Nvidia seems to be intent on keeping G-Sync proprietary for the time being. As a result, if G-Sync succeeds in winning wide adoption, then Nvidia will have established itself as a provider of display electronics. Also, folks who want to take advantage of G-Sync tech will need to own a GPU based on the Kepler generation of technology—essentially any GeForce GT or GTX 600- or 700-series graphics card.

Although having an open standard for display technology of this sort would be ideal, there’s reason to be pleased about Nvidia’s involvement in display chips going forward. There’s plenty of room for further innovation beyond the first-gen G-Sync tech. For instance, at the event in Montreal, John Carmack mentioned a low-persistence mode that, similar to Nvidia’s LightBoost tech for 3D glasses, strobes the display backlight after each frame has been painted in order to reduce motion blur. I don’t think that mode is ready for prime time yet, but that sort of technology could further mitigate the weaknesses of current displays.

We got the chance to take an early look at G-Sync in action using the monitor above. That’s an Asus VG248QE that’s been retrofitted with a pre-production G-Sync module. This display is marketed to gamers, and it’s based on a 24″ TN panel, with a 1920×1080 resolution and all of the attendant goodness and badness that comes with that sort of display. It’s blazing fast, with a 144Hz refresh rate and a claimed one-millisecond gray-to-gray response time, which makes it ideal for this sort of mission. Beyond that, to my eyes spoiled by constant exposure to glorious IPS panels, there’s little to recommend it. The color reproduction is poor, with obvious banding on simple gradients, and the panel has more light bleed than my neighbor’s Christmas display. The resolution, pixel density, panel size, and viewing angles are all relatively poor.
(Told you my eyes were spoiled. You try sitting in front of an Asus 4K panel for several weeks and then switching to something else.)

Once you have the right hardware and Nvidia drivers installed, using G-Sync is fairly straightforward. Just set the refresh rate to 144Hz, pop over to the “Manage 3D settings” portion of the Nvidia control panel, and choose “G-Sync” as the vertical sync method. After that, G-Sync should be working.

There are a few limitations to navigate, though. In order to use G-Sync, games and other applications must be running in full-screen mode, not in a window. Also, many games have their own vsync settings and don’t entirely comprehend what’s happening when G-Sync is enabled. Nvidia recommends setting vsync to “disabled” in any in-game menus. Even then, some games will limit frame rates to 60Hz, so you may need to tweak the “Preferred refresh rate” option shown above in the Nvidia control panel to “highest available.” Doing so allowed me to work around problems with a couple of games, including Call of Duty: Ghosts, without any obvious negative side effects. However, really high frame rates in Skyrim can make that game’s physics engine go bonkers, so handle with care.

So how well does it work?

Once I had the G-Sync setup working and ready to roll, I was hit by a realization that prompted a bit of soul searching. You see, as I’ve mentioned, I spend most of my computer time looking at IPS displays with amazing color reproduction. Those same displays, though, are largely limited to 60Hz refresh rates. That’s fine as far as it goes, but I’m very much aware that higher refresh rates can make for smoother gaming.

If you haven’t played with a 120 or 144Hz monitor, and if you’ve heard the oft-repeated myth about the human eye not being able to detect anything faster than 60 FPS, you might think that high-refresh displays aren’t any big deal. You’d be wrong. Just dragging an Explorer window across the desktop at 60Hz and then at 120Hz in a side-by-side setup is enough to illustrate how much smoother motion can look at twice the frame rate. There’s no question that a fun and fluid gaming experience is possible at 60Hz, but taking things up to 120 or 144Hz? More fluid and more fun, especially for quick-twitch games like first-person shooters.
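One way to see why the higher refresh rate reads as smoother: for the same on-screen motion, each refresh covers a smaller jump. The pan speed below is an arbitrary example value I picked for illustration, not a measurement.

```python
PAN_SPEED_PX_PER_S = 1200  # hypothetical pan speed for a dragged window or camera

for refresh_hz in (60, 120, 144):
    step_px = PAN_SPEED_PX_PER_S / refresh_hz
    print(f"{refresh_hz:3d}Hz -> each refresh advances the image by {step_px:4.1f} px")
```

Halving the step size per update is exactly the sort of difference your eye picks up when dragging that Explorer window.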

I realized I needed to be careful about understanding the benefits of G-Sync above and beyond the goodness provided by a high-refresh panel.

For its part, Nvidia doesn’t seem to see any contradiction here. After all, part of its pitch is about the benefits of the faster displays that will be equipped with G-Sync. Also, interestingly enough, Nvidia claims G-Sync’s most obvious benefits will come at lower performance, when frame rates are ranging between 40 and 60 FPS. I think that’s true, but the benefits of fast displays at high frame rates aren’t exactly trivial, either. They’re just not entirely exclusive to G-Sync.

Once I cranked up the G-Sync setup and began to play games, though, it didn’t take long to squelch my worries. My concerns about where the goodness came from, my snobby complaints about light bleed and color banding—these things all melted away as I was sucked into the reality unfolding with freakish fluidity in front of my eyes.

I’ll tell you more about my impressions shortly, but this is TR, where we’re slaves to empirical testing when possible. Let’s have a look at some slow-motion video captures.

Some slow-mo video examples

I wasn’t sure I could capture the benefits of G-Sync with my cheap slow-motion video camera, but it wasn’t terribly hard to come up with a meaningful example. All I had to do was find a scenario where the differences between vsync, no vsync, and G-Sync were apparent to the naked eye.

In this particular case, I wanted to compare the benefits of G-Sync to regular vsync on our fast 144Hz display. The scenario I came up with is in Skyrim, at 1080p resolution, on a GeForce GTX 660 graphics card. Frame rates fluctuate from about 60 to 85 FPS, and in this case, the camera pans around my character in a 360° arc, creating the sort of motion where both tearing and vsync quantization are easily spotted.

I recorded the results at 240 FPS, substantially faster than the display can refresh itself. Please pardon the low resolution and especially the different exposure levels. The camera auto-adjusted to make the scene look darker at lower refresh rates, and there wasn’t anything I could do to remedy that. That said, I still think these little videos, which came out of the camera unretouched and went straight onto YouTube, do a decent enough job of illustrating the relative smoothness and motion-related artifacts of the various display modes. Let me walk you through them.

At 60Hz without vsync, you can see how motion steps forward with each display refresh in very noticeable increments. Also, tearing is readily apparent. Oftentimes, multiple “seams” between rendered frames are onscreen at once. The tearing becomes easiest to notice starting at about 20 seconds into the video, as the city walls and the cathedral lurch by unevenly.

In this case, since frame rendering rates are generally above 60 FPS, switching on vsync looks like a clear win. The tearing artifacts are gone, and the animation advances fairly evenly. However, objects near or far from the camera, in the foreground and background, still advance in relatively large steps from one frame to the next.

Ah, see? At more than twice the refresh rate of the previous example, the animation looks much more fluid. However, without vsync, tearing is still prominent. The worst of it begins at about 16 seconds into the video, as the advancing hillside and walls appear to “waggle” from top to bottom as new and old frames intermix onscreen. Kinda ugly.

Here’s the near-perfect experience offered by G-Sync. The tearing is gone, the waggle is banished, and the animation is vastly more fluid. At full speed, the result is glassy smooth perceived motion. Compare this video to the one at 60Hz without vsync, which is how most hard-core gamers tend to play, and the contrast is stark. Good grief. What have we been doing all these years?

Once you’ve seen the G-Sync video, you can appreciate how even this mode, with vsync enabled at 144Hz, isn’t ideal. Watch as the cathedral swings by in the background with the halting, uneven motion caused by vsync quantization. Almost looks as if the frame update intervals are short-short-long, short-short-long, and so on. The animation looks a darn sight better than vsync at 60Hz, but even at 144Hz, G-Sync’s output is simply more correct and desirable.

Further impressions

As the videos demonstrate, G-Sync has tangible benefits over a fixed-rate 144Hz monitor, with or without conventional vsync. Nvidia says the improvements will be most easily perceptible at lower frame rates, from 40-60 FPS, and that makes quite a bit of sense. After all, that’s where vsync quantization is the worst. I tried several things to put this theory to the test.

First, I cranked up the image quality in games like Crysis 3 and Arkham Origins to get frame rates to drop into the range in question. I then compared G-Sync at 144Hz (max) to conventional vsync at 60Hz and 144Hz. G-Sync’s benefits were obvious over regular vsync at 60Hz in these cases, even more so than in our Skyrim example with higher frame rates. The difference was subtler at 144Hz, but G-Sync was still an incremental improvement over conventional vsync. Again, the degree to which you’d notice the difference was dictated by the type of movement happening in the moment.

Out of curiosity, I also wanted to see what G-Sync could do for a slower display, like an IPS panel with a 60Hz peak refresh rate, so I used the Nvidia control panel to cap the monitor at 60Hz. At these frame rates and with vsync enabled at 60Hz, frame output in Arkham Origins was often quantized to 30Hz or less. As a result, the game felt sluggish, with somewhat jerky animation. Turning on G-Sync with a 60Hz upper limit was a major improvement. It wasn’t quite as nice as G-Sync at 144Hz, of course, but playability was clearly improved. There’s a case to be made for outfitting a 60Hz panel with G-Sync control logic, no question.

However, I have to admit that the majority of my G-Sync testing time was spent at much higher frame rates, in Borderlands 2, a glassy-smooth Unreal Engine-based shooter with a frenzied, kinetic feel. This is my favorite game, and it’s my favorite way to use a brand-new gaming-focused display tech. I played on the GeForce GTX 660 at frame rates from 60-85 FPS. I played on a GTX 760 with frame rates ranging to 100 FPS and beyond. And I played on a GeForce GTX 780 Ti, where frame rates went as high as 120-144 FPS. Holy crap, it was awesome in every case, and more so with the faster graphics cards.

I tend to have a wee bit of an addictive personality, and playing BL2 with G-Sync’s creamy smoothness fed that tendency in wonderful and dangerous ways. That’s probably why this G-Sync write-up here is conspicuously late, and it’s definitely why my Christmas shopping was a perilously last-minute affair. I played through the rest of the Tiny Tina DLC, burned through the Headhunter packs, and found myself pondering another play-through when I finally had to stop myself and focus on Yuletide cheer rather than looting another bandit. Borderlands 2 offers very immediate feedback; it doesn’t use double- or triple-buffering, which is why its FCAT and Fraps results closely correlate. The speed and fluidity of that game on this G-Sync display is like a combo meth/crack IV drip. I could probably make “faces of G-Sync” a thing, if others are affected by it like I am.

I should note that not even a wicked-fast graphics card, a Sandy-Bridge-E CPU, and a G-Sync display can deliver perfectly smooth gameplay all of the time. You will notice the occasional hitch where a frame or two has taken too long to draw. We measure those all of the time in our performance tests, and the really fast, low-latency nature of G-Sync makes those hiccups keenly felt. G-Sync doesn’t solve every problem, even though it’s a vast improvement in display synchronization.

What’s next?

So I’m a fan of G-Sync as a technology. The questions now are about how the tech gets implemented. At present, you can only get G-Sync upgrade kits from select vendors, for installation in this Asus monitor—or you can pay extra to have the module installed for you. I suspect Asus will start selling a version of the monitor with the G-Sync module already integrated before long.

That will be a good start, but there’s still plenty of room for improvement. The Asus VG248QE monitor’s LCD panel just isn’t very good, even by TN standards. Some TN panels are pretty decent, believe it or not, and if we can’t get IPS displays in the first wave, I’m hoping we’ll see a few higher-quality TN panels built into new G-Sync monitors unveiled at CES. Nvidia’s module is capable of driving everything up to 4K displays, so here’s hoping we have a variety of choices as the year progresses.

I can tell you what I’d like to see happen. I’d like to see somebody produce a G-Sync-compatible monitor based on a 27″ IPS panel with a 2560×1440 pixel grid at a reasonable price. We already know that folks have had some success overclocking their cheap 27″ Korean IPS monitors to 85Hz, 100Hz, or better. An affordable 27″ IPS monitor with a peak refresh of 85 or 100Hz would be a very nice thing indeed. In my view, a panel of that sort with G-Sync and a low refresh interval would be a vastly superior choice for gaming to even one of the fancy 60Hz 4K panels here in Damage Labs. You’re better off spreading your pixels per second across successive frames than jamming them all into really tight pixel pitches, honestly.

Looking further ahead, one would hope that something like G-Sync could become a standard for computer displays at some point in the not-too-distant future, so the entire industry can embrace this sort of functionality. Because it’s very much the right thing to do.


Comments closed
    • dreamsss
    • 6 years ago

    anyone who denies the benefits of gsync does not deserve, i was one of the winners of the first kits and DAMN.. its such a huge difference. its not a gimmick and its not an imaginary change, while the price does not make it accessible to everyone, its WELL worth it

    every time my gf looks at my game (ff14) for example, she comments on how great my games look.

    “the speed and fluidity of that game on this G-Sync display is like a combo meth/crack IV drip. I could probably make “faces of G-Sync” a thing, if others are affected by it like I am.” <- its true

    • Freon
    • 6 years ago

    As much as it looks like a great thing, I’m not paying such an insane premium for a 1080P TN panel. G-sync looks nice, but not that nice. I got 110hz out of my cheap-ass 1440p display (and just got a 24AWG cable to try again at 120+), and that greatly reduces the effects of stuttering with Vsync on or tearing with Vsync off. No proprietary hardware. No 1080p limit. No massive premium for a crummy TN panel. It’s not perfect, but it was about half as much as one of these 1080P TN jobs, and works great as a desktop display.

    The tech fundamentally looks great, but Nvidia’s closed attitude could kill it.

      • Milo Burke
      • 6 years ago

      I feel like all of your complaints are only appropriate for the first batch. Of course the early models are extra expensive and limited on features. And the early adopters pay the price. But they also pave the way for future tech to reach us at affordable prices.

      Truthfully, I’m really excited about G-sync. I think this technology is exactly what the gaming industry has needed. I’m sure prices will fall, and I’m sure restrictions like TN or 1080p will be dropped. And whether I’ll be using brand-name G-sync or some other alternative in 12-18 months, I don’t care so much. I’m just really looking forward to such a smooth experience without paying ridiculous sums for extreme performance hardware.

      Personally, ~50 FPS on a variable-refresh rate monitor will be better than any gaming experience I’ve ever had. (Even though it’s clearly not the best.) And I’m already looking forward to it.

        • Modivated1
        • 6 years ago

        I am interested in getting a monitor that will eliminate stuttering too, it’s not my highest priority but now that it will be offered in one form or another for either GPU manufacturer I will buy the one that fits my system.

        I will give Nvidia credit for bringing attention to this; even though the foundation had been laid out for this ability a long time ago, it probably would never actually have been incorporated without this attention.

      • Airmantharp
      • 6 years ago

      As Milo said above, this is just round one.

      We should expect both that a much wider variety of monitors will come with G-Sync equipped, and that the price for G-Sync over a non-G-Sync display should drop precipitously. And that’s assuming that you can get the same display without G-Sync, since the technology replaces parts too.

      Just be patient; my bet is both that G-Sync will be more wide-spread than what we’re seeing at the outset, that the price will become reasonable very quickly, and that compatibility with other GPU vendors won’t be as ambiguous as it is today.

    • WaltC
    • 6 years ago

    Where’s the G-sync 60Hz example? I see G-sync @ 144Hz, but not at 60Hz.

    Not trying to be “cute” with the question, either, but I have to say my Skyrim setup at home looks much closer to your 144Hz G-Sync example than it does to your 60Hz non-G-sync examples even though I have a ton of very high-res textures loaded in the game, run a single 7850 2GB @ 1.05GHz, 1920×1200, and a 60Hz monitor which I generally run with vsync off. I don’t see anywhere near that level of, as nVidia misnames it, “stutter.” I think your examples were good in that they correctly illustrate the principle of what nVidia’s hoping to do here with G-sync, but in practice my non G-sync display looks much better than your non G-sync 60Hz examples. I’m sure there are many variables in the situation, though.

      • Airmantharp
      • 6 years ago

      If you have V-Sync off, you have less perceived stutter, because you have tearing instead. It’s one or the other, except that with V-Sync on, you also get input lag.

      But I’d still like to see a 60Hz comparison, if that’s possible.

      • superjawes
      • 6 years ago

      Um…

      1. The G-Sync monitor is 144 Hz, so there is no 60 Hz example. The 60 Hz video is to illustrate the difference between 60 and 144 Hz because Scott recognizes that he’s used to 60 Hz. That means that some of the perceived smoothness could be coming from the higher refresh rate (lower refresh time) rather than from G-Sync itself.

      2. It’s slow motion video. Everything *should* look like it’s stuttering because it’s a series of still images. At full speed, with all other things being equal, 144 Hz and G-Sync should make everything appear smoother, or at least crisper. EDIT: even if you can set the G-Sync monitor to 60 Hz and turn G-Sync on, I am not sure it would change the refresh time, so it wouldn’t necessarily be a true example of G-Sync at 60 Hz.

        • WaltC
        • 6 years ago

        OK, so in other words you’re telling me that the examples that were meant to be representative of G-Sync actually aren’t. I take it that this means you mostly agree with me…;)

          • superjawes
          • 6 years ago

          No, I’m telling you that everything looks choppy because it has been slowed down. Slow down any animation, movie, or even .gif and it would look like a slide show. At full speed, 60 Hz will look smoother, but G-Sync would look even smoother than that, and you should be able to get the smoothness while making fewer sacrifices to eye candy (textures, filtering, etc.).

      • Airmantharp
      • 6 years ago

        G-Sync at 60Hz, compliments of Anand (https://www.youtube.com/watch?v=y_uhrLySk64). Based on that, I'm sold on G-Sync with a higher-end IPS monitor that can be reasonably calibrated for photo work.

        • WaltC
        • 6 years ago

        I guess people are down voting my sincere query because they simply don’t believe me when I say my non-G-Sync 60Hz display looks much closer to the G-Sync examples than it does to the purported non-G-sync examples. Glad to see the 60Hz example from AnandTech, but have to say, again, that my 60Hz, vsync off, BSI is *nowhere near* as rough as his “example” of a non G-Sync display running BSI. I really wish I could say different, because I had thought the G-Sync idea interesting, but I can’t.

          • Waco
          • 6 years ago

          That’s because without Vsync on you’re not getting stuttering, you’re getting tearing…

          Perhaps you should take a 720p60 video of your screen while playing Skyrim and try slowing it down by 50%? I imagine you’d see both stutter and tearing.

          This tech (and AMD’s implementation as well) come much closer to solving the fluidity problem than anything before. With regular setups it’s like the 3:2 pulldown jutter when watching 24 Hz movies on a 60 Hz screen…

    • Arclight
    • 6 years ago

    “G-Sync will paint the screen again with the prior frame. So the refresh interval isn’t infinitely variable, but it does offer pretty wide leeway.”

    Uuuuuu. I think I’ve experienced this a few times in multiplayer games due to lag (the cause was obviously different but the results should be the same) and it’s WAY more annoying than tearing. Most of the games that I play don’t tear badly enough for me to be bothered, although granted there have been a few exceptions, like Rage, which even I couldn’t play without vsync.

    https://www.youtube.com/watch?v=y_uhrLySk64 It can be noticed, I think, around the 0:46 mark.

    • Modivated1
    • 6 years ago

    Despite my being a Fan of AMD tech I have to apologize for the passion that I applied (in this case) concerning Nvidia’s intent to purposely develop proprietary Tech, apparently after reading about an AMD version that has to do the same thing.

    The nature of the technology is that it aligns with the GPU to pace its frames as the GPU puts them out. Now it’s obvious that a GPU is custom tech and compatibility must be a match for this to work, but I, like many others, have developed tunnel vision and pointed the finger, villainizing Nvidia.

    So in spite of my cheerleader-ness I would like to say that I am sorry.

    Anyway, for all of you AMD fans out there here is basically the same thing that has been developed by AMD.

    http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014 It will be given to LCD vendors and allowed to be implemented at no extra cost. Thus it is named... Free-Sync. Don’t believe me? Click the link.

      • Meadows
      • 6 years ago

      It is a standard that not all LCDs support, making it no better than G-sync. Also, the targeted devices are low-power to begin with and their screens would never do more than 60 Hz.

      Still, from a power conserving standpoint, this might be useful. I doubt AMD will have a real answer to G-sync this year. Every time they’re silent like this, it usually means they’re caught off-guard.

        • Modivated1
        • 6 years ago

        In any case, if you read my article it was more than to point out Free-Sync, it was to acknowledge that this kind of tech is naturally proprietary and therefore the criticisms against Nvidia are misplaced. I know that AMD’s tech is also proprietary; that is what opened my eyes.

        As for being as good or not, I will wait until I can see it to believe it. As it stands, the article above says that there is still the occasional tear, so I imagine that the tech is infant and still needs to be improved (much like 3D).

        As long as it does the job and I can combo it with Mantle then I will be happy.

    • itachi
    • 6 years ago

    AHha at the end of conclusion you exactly said what I was thinking since I heard about G-Sync, a 2500x reso 144hz Gsync monitor, then my hard earned money will be well spent I think :).

    • TwoEars
    • 6 years ago

    I finally got the time to read this article and I just want to say thanks for an absolutely fantastic write-up!

    The slow-mo captures of higher frame rates were the perfect illustration, and I really agree with a lot of your thoughts and conclusions.

    27″ 2560×1440 IPS panel with G-Sync and 85-100Hz?

    Bring it on! It’s what I’ve been waiting for. Can also be VA panel, they’re good enough.

    That 1080P eizo gaming monitor with VA panel would be fantastic if they made it G-Sync and 2560×1440 instead.

    • TrptJim
    • 6 years ago

    I think Nvidia should go all the way and produce their own line of G-Sync monitors. They are making the G-Sync logic boards already, so it wouldn’t be too far of a leap. They could push for high resolution, 100hz+ IPS panels and overclocking in a way other manufacturers have shied away from.

    • cmrcmk
    • 6 years ago

    Scott, I don’t know why you’re getting so much backlash from this piece. I really appreciated it. Even beyond knowing more about Gsync, this article convinced me that I need to pay more attention to refresh rates on my monitor (unfortunately, my 27″ 1440p monitor is only a year old so I don’t think I’ll get to replace it any time soon).

    • Bensam123
    • 6 years ago

    The VG248QE definitely needs a color profile in order to look remotely pretty. You also need to adjust the brightness down quite a bit (like 18). I disagree about trying to replicate glassy smooth goodness on an IPS panel (which will turn into water paints) over a TN… but we all have our preferences. It’s entirely possible to get a nice looking TN monitor.

    All of this sounds good except for the 33ms minimum and also the proprietary nature of this. They definitely have something here; they also had something with PhysX, and we saw how that was squandered over the years. I’m sure AMD already has something in the works just like this, only it’ll be open and available for everyone… including Nvidia, and G-Sync will wither away. Which is quite sad, as they definitely developed this first. I definitely like the prospect of this, but I’m not buying another $400 card for it… Just the same, if this was available for AMD users only, I’m sure Nvidia users wouldn’t want to drop another $400 on a different video card even though theirs works perfectly fine.

    So I’m just going to cover my ears and hum really loudly and hope AMD comes out with their own version. I’d also rather not spend $200 on a upgrade card.

      • mdrejhon
      • 6 years ago

      There’s no “hard” 33ms minimum. It’s only a frametime skew — the repeat refresh cycle only skews the subsequent frame timing by, at most, +6.9ms.

      On the VG248QE, each G-SYNC refresh takes 1/144sec (6.9ms) regardless of current G-SYNC refresh rate — 6.9ms to scan from top-to-bottom.

      The repeat refreshes don’t inject major stutter at lower framerates; they just inject some minor microstutters (a microstutter error of 6.9ms during frame intervals slightly above 33ms), and only at certain framerates very close to 30fps (and certain lower numbers).

      Framerates such as 24fps would look G-SYNC native.
      This is because once frametimes go to (33ms+6.9ms) = ~40ms … basically 25fps or less; everything looks like it’s natively G-SYNC’d (e.g. 23fps, 24fps and 25fps will look as if it’s natively supported, as the next frame would arrive after the repeat re-refresh was already finished). At least until framerates go all the way down to ~15fps (when two repeat refreshes are needed — 66ms) But at that point, repeat-refresh microstutters would be quite indistinguishable (+6.9ms during an already-choppy 66ms)

      That said, I did argue on AVSFORUM that this technology needs to be open, to be implemented as part of HDMI 3.0. (it’s a very good thread over there).

    • Pax-UX
    • 6 years ago

    While the benefit is obviously better, it’s just not worth the kind of money Nvidia wants for this feature. I’ll only pick this up once my current monitor gets too out of date or breaks. Normally that takes years.

    • alienstorexxx
    • 6 years ago

    anyway, i think the future will be to have 2 gpus, just because of all these exclusive technologies and the big egos of the people in front of those companies.
    amd and nvidia are aware that pc gaming is growing and want to prepare the battlefield for it.

    pc future will be just like consoles…
    nvidia gameworks games (or maybe something new coming with maxwell)
    amd mantle games
    steamos games

    and then,
    ps4 games
    xbox one games
    android games
    ios games
    and of course windows phone games too

    this is just getting better and better.

    • balanarahul
    • 6 years ago

    “it doesn’t use double- or triple-buffering, which is why its FCAT and Fraps results closely correlate”

    Wait what?? It may not use triple buffering. But it has to use double buffering. There is no such thing as single buffering.

      • auxy
      • 6 years ago

      Sure there is. Each frame is scanned out to the monitor immediately after composition. You can enable this on any app in the Nvidia control panel by setting Maximum Pre-rendered Frames to its minimum setting, which I believe is 0?

    • Tamale
    • 6 years ago

    Very cool tech. Really hope it’s not an nvidia only thing eventually though. Nice writeup!!

    • Milo Burke
    • 6 years ago

    I was really disappointed not to see the traditional price-performance scatter plot on the last page. Would you please insert a scatter plot with one data point simply to remind me I’m at TechReport?

    • auxy
    • 6 years ago

    Why does it have to be IPS vs TN, Scott? (And by the way, the VG248QE *does* have one of the best TN panels available, going off the specs.) I’ve said it before and I’ll never stop saying it: refresh rate > worst-case panel response = blurry mess. Overclocked IPS is not the answer; especially not for games, which often benefit from high contrast. EIZO has the right of it; MVA is the way to go with LCDs for gaming at this time. Especially with newer games pushing ever greater levels of detail; chroma resolution is also resolution! (Insufficient contrast causes you to lose chroma resolution.)

      • Damage
      • 6 years ago

      Yeah, you’re right. IPS isn’t the only way to go. I shouldn’t use “IPS” as shorthand for “quality/not TN panel.” Heh.

      That Asus monitor’s TN panel is not nearly as nice as the ones in the Dell 22″ monitors I have here, though. They have less light bleed, better color and contrast, and everything else except speed. Specs are one thing, but seeing is another. 😉

        • superjawes
        • 6 years ago

        Video quality has two components: image and animation. IPS has better image quality while fast TN displays do better in animation.

        It would be nice to get both at an affordable price, but I’m not holding my breath.

        • LastQuestion
        • 6 years ago

        Eizo QC ensures their panels have no BLB, arrive with the advertised contrast ratio, and come with an accurate factory calibration. Their input latency is also under one frame.

        • auxy
        • 6 years ago

        Ehh … well a lot of that has to do with the monitor construction quality … but I know what you mean… I guess. My VG248QE looks a LOT nicer (in terms of colors and contrast) than my VS229H-Ps, which are eIPS. They just have the perfect viewing angles.

      • briskly
      • 6 years ago

      VA isn’t perfect either, it’s still really really slow in dark transition scenes, and loses that shadow detail viewed head-on. Rather unfortunate as you normally would look to a VA to provide superior black level. The foris still seems very nice FWIW.

      Contrast ratio is one of those things that has gone to the wayside with LCD adoption, and it’s a damned shame it has.

    • jessterman21
    • 6 years ago

    Great write-up, Scooter. Looking forward to those cheapish 60Hz IPS 1440p G-sync monitors or better yet – a QHD Oculus Rift with G-sync…………. <salivating>

    Quick note on my own testing recently:

    I’ve set up most of my third-person games with Adaptive Vsync because my i3-2100 can’t run many games on High-Ultra above 60fps consistently, but playing them with a wireless X360 controller on my TV introduces way too much lag for most games (HDMI+TV+controller+Vsync). I hate tearing, but it felt like 100-200ms lag at times.

    I’ve found on my 60Hz display that locking my framerate at 45fps reduces tearing visibility pretty well – I believe it’s because the tearing is at regular intervals, interspersed with whole frames (tear at top third, no tear, tear at bottom third, repeat). Motion-blur helps hide it even more. I know most are used to higher framerates, but can anyone with a 60Hz panel check this out, to indulge my curiosity?

    Probably the closest I’ll get to G-sync for a while.

      • mdrejhon
      • 6 years ago

      It’s correct that tearing is more visible at framerates near refresh rates.

      — Harmonic frequencies between framerate and refreshrate, can create stutters
      — Stationary and rolling tearing artifact can also occur in this situation.

      That said, 45fps@60Hz might still cause some harmonic effects with tearing, but it’s now too faint and is hidden by your motion blur; it will likely create a more regular microstutter that looks better than 43fps@60Hz or 47fps@60Hz.

    • Klimax
    • 6 years ago

    Hopefully that 24” 4K Dell panel will have this…

    • Ryhadar
    • 6 years ago

    G-Sync, super fast refresh rates, multi input touchscreens, a wide array of panel types, etc.

    I may be alone here, but after seeing LCD manufacturers ride the “Look how many more pixels we have!” train for years the monitor space is starting to finally get interesting for me.

    Great write up, by the way. I’m looking forward to G-Sync working with other GPU vendors’ cards; when it does, I’d like to pick up a G-Sync monitor.

    • windwalker
    • 6 years ago

    Thank you for the very nice write-up, Scott.
    It’s still unclear to me though if this technology is a must have for the average gamer.
    For a budget limited build is it better to pay more for a Gsync display or to buy a more expensive graphics card to sustain 60 FPS and use Vsync?

      • Chrispy_
      • 6 years ago

      G-Sync requires an expensive, specific monitor which has a cost delta over a non G-Sync monitor high enough to buy a decent GPU alone. For budget builds you need to spend that on a decent GPU instead. The options of say a GTX770 instead of a GTX650 + G-Sync monitor are comparable in price but not in performance.

      If you can afford a GPU that achieves a minimum of about 40fps at all times, then (and only then) does G-Sync become an enticing alternative to gaming at 120Hz with Vsync.

      My priority for budget decisions for gaming would be something like this:
      [list=1<][*<]Hardware that can provide a minimum of about 40fps at all times without vsync[/*<][*<]Hardware that can provide a minimum of 60fps at all times with vsync[/*<][*<]Hardware that can provide a minimum of 40fps at all times using a G-Sync monitor[/*<][*<]Hardware that can provide an average of over 90fps, enabling 120Hz and 60Hz fluidity with vsync.[/*<][/list<] The 40fps figure is pulled from general agreement that most people experience fludity of motion at around 40Hz (which is very different to 40fps on a 60Hz screen). 40fps@60Hz without vsync is good enough for fluidity but necessitates tearing. As you spend more money you can aim for the first step to improved fluidity which is to hit 60fps constant and enable vsync. The next step up from vsync is G-Sync but just like vsync it needs a minimum framerate to have any real effect. You should aim for a high minimum framerate first, and once you have that hardware, then consider better vsync solutions to reduce latency and tearing. I often wondered how hard it would be to use adaptive vsync - where vsync is on for above 60fps but it allows incomplete frames (vsync off) if the GPU can't make 60fps until the framerate returns to a level high enough to refill the buffers with complete frames again.

      • jessterman21
      • 6 years ago

      Chrispy’s right, as long as you have a setup that can maintain 40fps minimum, G-sync is worth it. 60fps+Vsync guarantees an extra 17ms of input lag, which is too much for fast-paced games, IMO.

        • Pwnstar
        • 6 years ago

        Which is why Vsync should never be enabled.

          • Meadows
          • 6 years ago

          Give me a hug, brother.

      • Damage
      • 6 years ago

      It’s way too early to say yet. I don’t know what your budget is or what graphics card you have now. Heck dunno what monitor or CPU, either. Even if I did, the only display that offers G-Sync is kinda bad in most ways other than speed. This is just a preview of a tech that should be coming in products later this year. You’ll want to wait until more products arrive before pulling the trigger, I think.

        • windwalker
        • 6 years ago

        I didn’t mean just for me and just for right now, but as a guideline.
        To add the budget perspective I could ask at which levels of the buyer’s guide would Gsync be too expensive, optional, recommended or required.

        As an example, Chrispy_ put it in a very clear priority list in his reply above.
        Do you agree with it?

          • Voldenuit
          • 6 years ago

          Perhaps for a gaming-oriented build, G-SYNC could be a higher priority, or as an option (with caveat of Kepler+ card).

          The product(s) right now are too embryonic for G-SYNC to be a recommended option for most people, IMO. As Scott alludes to in the article, I'd like to see cheaper monitors with higher-quality panels, and I still have unanswered questions about how the lack of hardware scalers and comb filters will affect compatibility of G-SYNC monitors with consumer electronics and video sources (I use my PC monitor for TV/video output). This will hopefully pan out as second generation monitors appear with ASICs instead of FPGAs, and this would also resolve some of the missing components in the current FPGA circuit board.

      • HisDivineOrder
      • 6 years ago

      I suspect the idea is you buy a Gsync display that ONE time you buy a display in the next 5-10-ish years (I mean, I don’t buy displays that often) and GPU’s are updated every 1-3-ish years.

      I’d imagine a beancounter somewhere inside of nVidia (perhaps The Alpha Beancounter, the CEO) did an analysis and realized they might lock users into only nVidia no matter what if they could get someone to buy into the nVidia-exclusive gsync for their next monitor upgrade, which should be coming up for a LOT of users who skipped 1440/1600p while using 720p/1080p. Those users would be aching to move to 4K and beta testing Gsync now to have it ready for when 4K is ready for gaming and enthusiasts and anyone looking to buy for less than 1K is probably wise.

      So nVidia does their beta now and by next year, they have gsync far cheaper and they’d like to think more commonly adopted. When the 4K low(er) cost craze hits, Gsync will be in place to win some gamer-centric monitor contracts about then and you’d have nVidia users for the next 5-10 years (or longer?) since the monitors will push them to nVidia hardware.

      It’s a cunning plan. I think it’s a shame Gsync was not an open standard, but I also happen to think this is JUST the kind of thing Displayport ought to have built in, especially with it being such PC-centric port anyway.

        • windwalker
        • 6 years ago

        It’s fine by me as long as they manage to get sufficient design wins to cover the gamut of preferences for displays.
        This whole “open standards” issue is total BS.
        Nvidia did plenty of work to develop Gsync and now you expect them to deliver it on a silver platter to their competitors. No ****ing way.

      • wingless
      • 6 years ago

      Yes it is a must have. Trust me. When this tech makes it into the regular monitors you MUST own one. Seeing is believing.

        • windwalker
        • 6 years ago

        Well, I did see the videos attached to this preview and the differences between Vsync and Gsync were quantitative, not qualitative.
        If the Gsync addon is as expensive as it seems it will be, it may very well not be included in many regular displays but mostly in those targeted at gamers.
        I’m not going to pay a premium for a TN display regardless of Gsync or refresh rate.

    • dragosmp
    • 6 years ago

    A more immediate benefit should be (I hope) the rise of 85+Hz monitors. Those old enough to own CRTs probably remember very well that 60Hz was a headache-inducing mess, 72 was better and 85 was good. “Humans can’t see more than 60Hz” is utter crap, the only reason why it’s acceptable is the lack of visible flicker for LCD vs the very visible flicker of CRT.

    G-sync is the cherry on the cake

      • TAViX
      • 6 years ago

      Man, the refresh rate for LCD and CRT IS NOT THE SAME THING! Even with 30hz, you don’t have flicker on LCDs because of different technologies. And you are also confusing Hz with FPS…

        • auxy
        • 6 years ago

        I’m about to blow your mind.

        Hertz (Hz) refers to repetitions per second. “Cycles”, if you will. It can be used to describe anything that happens ‘per second’. Like revolutions. Or *frames.*

          • thor84no
          • 6 years ago

          And crucially, what a CRT does and an LCD does in those “cycles”, are completely different and hence have different effects. They’re simply not comparable in their visual effect based on refresh rate alone. CRT *needs* a much higher refresh rate because the way it refreshes lines is horrific and headache inducing.

            • dragosmp
            • 6 years ago

            @thor84no: maybe, in a way. I’m not saying LCDs flicker, they obviously don’t, hence the illusion of fluidity is easier to accept. I think we agree that since we talk about refresh rate we’re talking about a discrete phenomenon that needs to happen fast enough to give the illusion of a continuous phenomenon.
            My point is 60 Hz isn’t always fast enough. It is if you’re using Word; it isn’t if you’re doing battle in Skyrim. That’s why graphics cards should be able to command high FPS (Hz) to give the illusion of continuous movement, as well as low FPS to save power when it isn’t needed, like when typing as I do now

            @auxy: awesome post 🙂

          • Voldenuit
          • 6 years ago

          Shhh! You’ll make his brain Hertz.

    • davidbowser
    • 6 years ago

    Love the innovation and disruptive tech, but hate to see it be Nvidia only.

    Going a little off here, but either the standards (60Hz refresh, 29.97 FPS, etc.) need to change with a new variable rate standard, or someone like Nvidia is going to create a proprietary method that is more attractive. I realize that much of this is based on legacy tech (CRTs), but there is certainly enough momentum for a break and non-backwards compatible standard.

    Imagine a movie that could be “filmed” at a variable speed where fast and irregular motion is captured as smoothly as life, but slower fixed scenes could be lower speed (bit rate). Virtual reality-type gaming would be a good start, but I see this type of breakthrough having much broader implications than video games.

      • Meadows
      • 6 years ago

      Your idea is not very good because your eyes notice changes in refresh rates very well, so the camera would not only need to have a minimum recording rate, but it would also have to be very high at that. Perhaps not even the new fad of 48 fps would be high enough for a minimum.

      I could imagine variable frame rate films only if they’re something between 60-120 or 60-240 fps, potentially with the extra frames blended only in post-production. That would be very expensive to do, we should be happy if a fixed 48 fps catches on at all for the time being.

        • davidbowser
        • 6 years ago

        I think you are making an erroneous assumption on what a “minimum recording rate” would be. You seem to be thinking of standard today rather than what is possible. When discussing innovation, it is common to speak in terms of future rather than present, so fixing what is “expensive to do” is part of the innovation process. My example of a movie, was just that, an example. I also mentioned VR gaming, so I am not focused on films, but the supporting tech. I purposely made no mention of what a “lower speed” would be because it is irrelevant in today’s terms. For example, frame rates for movies have been the same (between 24-30 fps) for nearly 100 years, so the innovations (and corresponding standards) have simply not been adopted.

          • TwoEars
          • 6 years ago

          Just make it 4k and 100Hz, why hold back?

          If they implemented some clever lossless compression, I don’t think it’d end up that much bigger than today’s Blu-ray discs.

            • Voldenuit
            • 6 years ago

            The Hobbit was 4K and 48 fps, and it looked like junk. The motion was stuttery, and the super duper high resolution meant that you could see every imperfection in the props, and contact lenses in Gandalf’s eyes.

            Sometimes, suspension of disbelief requires you to see less, not more.

            Not saying that there shouldn’t be progress in film recording, just that it’s going to come with teething pains, just as color and sound did to existing films of the day.

            • Meadows
            • 6 years ago

            That they made it look like junk doesn’t mean the underlying technology itself is bad. Methods exist to hide all those things.

            • Voldenuit
            • 6 years ago

            Hence my qualifier that it was teething pains.

            But that also means that ‘moar rez’ and ‘moar fps’ are not a panacea.

            • jihadjoe
            • 6 years ago

            I think you’re looking at this the wrong way.

            Higher resolution and refresh rate is just a bigger canvas, or the availability of a thinner brush. It’s still up to the painter to make use of it, or not, depending on what sort of picture he wants to paint. Picasso is still free to indulge in his cubism, but without the small brushes we wouldn’t have the hyper realists at all.

            • Chrispy_
            • 6 years ago

            [quote<]Sometimes, suspension of disbelief requires you to see less, not more.[/quote<] Always, I think. The more your own brain has to fill in the blanks itself, the less chance there is that it'll spot something it doesn't agree with.

      • jihadjoe
      • 6 years ago

      IMO it’s not really much of a problem for movies because everything is pre-rendered.

      In an HFR movie they can just duplicate frames to simulate a low frame rate and the accompanying “film-like” feel, and then jump to HFR during high-motion scenes for maximum clarity.

      • psuedonymous
      • 6 years ago

      [quote<]Imagine a movie that could be "filmed" at a variable speed where fast and irregular motion is captured as smoothly as life, but slower fixed scenes could be lower speed [/quote<]This already exists; it’s called [url=http://www.youtube.com/watch?v=NkWLZy7gbLg<]Showscan Digital[/url<], and it was designed by the well-known Douglas Trumbull.

    • UnfriendlyFire
    • 6 years ago

    Any bets on whether Nvidia is going to keep G-Sync for themselves, like PhysX or version two of TWIMTBP?

      • Klimax
      • 6 years ago

      Unless it’s patented in some particular way, VESA should be able to include it in the next version of the standard. (And in most cases they’re able to avoid any patents.)

      • superjawes
      • 6 years ago

      It’s hardware. PhysX and TWIMTBP are software, and require some specific in-game programming that favors Nvidia chips. G-Sync, on the other hand, is more of a display strategy. Once the design is finalized and monitor ASICs are replaced with G-Sync capable ones, all you need to do is figure out how to tap into it over the cable.

      Nvidia could patent the final chip, but even then, they’d just sell it to monitor manufacturers for extra profit of their own. This would slow adoption, but it still wouldn’t prevent someone like AMD from activating it with their own chips and drivers.

      • nanoflower
      • 6 years ago

      I would expect that even if Nvidia intends to open up G-Sync for AMD/Intel or others to use it won’t happen until 2015. They will spend this year fine tuning it and perhaps moving away from the FPGA they are using now.

    • Modivated1
    • 6 years ago

    New articles are out about Nvidia’s “black box” syndrome: basically, Nvidia creates code that developers can apply to their games but cannot customize, because the code is locked away where the developer cannot see it. Therefore, game errors that exist because of this black-box approach cannot be optimized by anyone except Nvidia. Nvidia waits until the game is released and then optimizes the code for Nvidia cards, so as to make consumers think that AMD is not doing their job with drivers.

    However, AMD cannot optimize because like the Developer they cannot get to the code. So if you are playing a TWIMTBP game then you will likely see Nvidia advantages.

    Hmm, I am surprised to see that there is no article or link to an article on this site. I wonder why that is?

    2nd EDIT: The comment above is a below-the-belt insinuation. Forum community of TECHREPORT, please forgive me for my rudeness.

    EDIT: Here are the details on more of Nvidia’s proprietary strategy.

    [url<]http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd[/url<]

      • ronch
      • 6 years ago

      Shady practices like this turn me off. This is why I don’t like Intel and have always gravitated towards ATI/AMD for graphics. Not saying AMD never did this sort of thing, but Intel and Nvidia seem more accustomed to doing it.

      • Klimax
      • 6 years ago

      Repeated stupidity isn’t magically changed into non-stupidity.

      There’s not much correct about the whole thing, including the assertion that AMD cannot optimize. They can; it might be a bit hard, but they can. See Global Illumination…

      The entire premise is wrong.

        • Modivated1
        • 6 years ago

        Go to YouTube, type in “Encapsulation”, and find a programming tutorial that will explain it to you. The purpose is to deliver information to a user or requesting program without showing how the process was done. It also locks anyone out from making changes to the program. That is its intent.
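        (A toy sketch of that idea, with made-up names that have nothing to do with any actual GameWorks code: the caller gets a result but can’t see or change the internals.)

```python
# Toy illustration of encapsulation (hypothetical names, not real GameWorks code).
class BlackBoxEffect:
    """Callers get results but cannot see or tune the internals."""

    def __init__(self):
        self.__magic_factor = 0.42   # name-mangled attribute, hidden from outside code

    def render_cost(self, scene_complexity):
        # The caller only sees the output, not how it was computed.
        return scene_complexity * self.__magic_factor


fx = BlackBoxEffect()
print(fx.render_cost(100))    # works: 42.0
# print(fx.__magic_factor)    # AttributeError: the internal state is not exposed
```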

        Are there ways around it? Yeah, but the comparison is like trying to get to a destination by driving down a flat road through a valley vs. scaling a mountain on foot. The time and effort it takes would not likely be profitable. By the time they figured it out, we would be on the next generation of GPUs.

      • Damage
      • 6 years ago

      You’re way off topic.

      The reason we don’t have an article about that story is because we’re not finished researching it ourselves. We’re learning the truth may be more complicated than what’s been reported, but we need to know more before drawing any conclusions.

      Thanks for the accusation, though. Really brightened my day.

        • Modivated1
        • 6 years ago

        Actually, there was a post talking about proprietary actions that this was in response to; I just put it up as an original post so that more people would see it. However, anyone who has taken an object-oriented programming class knows about encapsulation and why you do it. If it’s true, then this is definitely a shame on Nvidia.

        Yeah, you are right, it was a cheap shot below the belt. I am sorry for jumping to conclusions like that.

      • HisDivineOrder
      • 6 years ago

      Somehow, I suspect that while you criticize nVidia for creating code that is not optimized for AMD hardware, you’re fine with Mantle, right? 😉

        • Modivated1
        • 6 years ago

        Nvidia’s current gen GPU is not designed to take advantage of Mantle but Nvidia is not locked out of being able to use Mantle. Nvidia’s Maxwell chip could be configured to use the Mantle API if they wanted to.

        Can you say the same about this BlackBox Encapsulation? No.

          • Meadows
          • 6 years ago

          Do they pay you well? The red guys?

          • Deanjo
          • 6 years ago

          [quote<]Nvidia's current gen GPU is not designed to take advantage of Mantle but Nvidia is not locked out of being able to use Mantle.[/quote<] You can provide open documentation to back that up right?

      • YukaKun
      • 6 years ago

      Or if you’re an AMD video card user, you can download RadeonPro and use a different video card ID so games made for nVidia think you’re using one.

      It’s still a cheap shot from the green team, but Jen doesn’t give a crap about fair competition or open standards. Not even nVidia fanbois will change that; the only things that can change it are lawyers or poor sales.

      Cheers!

      • Voldenuit
      • 6 years ago

      That article makes a lot of assumptions and accusations without much to back its theory up. They’ve also had to publish retractions on some of their claims.

      I mean, how could two completely different developers (Rocksteady and WB Montreal) working with the same engine produce different results? Shock! Horror! Alert the presses! There’s a conspiracy afoot!

      /sarcasm (sarcasm tag added for clarity).

      If there [i<]is[/i<] a conspiracy, then I would like to see it outed. But to expect Scott and Co. to publish a hasty and unresearched "me too!" article just to jump on a high profile bandwagon is disrespecting the journalistic integrity of this site. EDIT: grammar and spelling

        • Modivated1
        • 6 years ago

        It’s far too late for them to jump on the bandwagon; this has been out for at least a week and I have found it at multiple sources. In their defense, they stated that they hadn’t published it because they were researching it.

        Look at Damage’s response to my original post.

    • Sargent Duck
    • 6 years ago

    [quote<]and found myself pondering another play-through[/quote<] Psssshhhh, I'm on my 6th... Axton (lv 72) Maya (lv 50) Salvador (lv 32)

    • alienstorexxx
    • 6 years ago

    I haven’t read the whole article yet, but I think I’ve read enough to post this: what is the problem with talking about triple-buffered vsync?
    I’m amazed that people who are always looking deeper into things, like FPS and frame pacing, don’t want to take a look at something that is very important if you’re talking about vsync.

    Weren’t you the ones who said “FPS doesn’t tell the whole story”, or something like that?

    Well, those Nvidia graphs (curiously) don’t tell the whole story either. Why stick with them (besides being lazy, as you said)? At least say that those images are only partially correct.

    If this is just a “first look” as the title says and you want to make a full article later, that’s OK. But four pages about this, plus videos and all that; I’ve seen purportedly deeper articles run four pages.

    “You’re better off spreading your pixels per second across successive frames than jamming them all into really tight pixel pitches, honestly.”
    New war zone: AMD wants to go 4K, Nvidia wants to go 144Hz.
    And everyone goes fully retarded in another t-shirt fanboy discussion.

    out.

    -wow already? too much haters, maybe i s h o u l d D I E E.. –

      • wuzelwazel
      • 6 years ago

      Well, I *have* read your entire comment and I still have no idea what you said.

        • alienstorexxx
        • 6 years ago

        That’s pretty much because there’s nothing about this in the article 😉

          • Damage
          • 6 years ago

          I do mention buffering in the article, right in my explanations. I’ll admit I didn’t talk about triple-buffering extensively out of a desire to keep the explanations from being even more complicated, but your apparent suggestion that the article gives the wrong impression is simply off.

          Even games that buffer multiple frames have slowdowns that interrupt a triple-buffered stream of frames sufficiently to cause quantization. Look at the quantization in Skyrim, for instance. The video shows it happening with vsync enabled, and I’m fairly certain that game is triple-buffered. Proof here in divergent FCAT vs. Fraps results:

          [url<]https://techreport.com/review/24553/inside-the-second-with-nvidia-frame-capture-tools/5[/url<]

          I'm not sure what exactly you think is missing here or why it matters. I'm happy to discuss in more detail, if you'll moderate your tone and speak clearly. Thanks.

            • alienstorexxx
            • 6 years ago

            Those slowdowns are not far from reality; if there are any, it has to do with GPU performance, not with the buffer. If the frame is ready in time for the monitor refresh, it will be shown; if not, it won’t, and in the worst case you will be one frame late to the party (or 16 ms?). The difference with G-Sync is that you can be half a frame sooner, or 8 ms, or anything between 16 ms and 0 ms.
            Correct me if I’m wrong.

            I don’t think this level of quantization is bad enough to make it worth an add-on that makes gaming more expensive. Sorry, but I’m not going to add something else to my gaming budget.

      • jessterman21
      • 6 years ago

      triple-buffered vsync introduces up to 3x input lag

        • alienstorexxx
        • 6 years ago

        Double at most. The GPU does not render frames from the past; it’s just the monitor showing a frame later than it should.
        Triple buffering means that while one frame is being shown and the next is buffered, the third can already be rendered even if it isn’t going to show up on the next refresh; there are your three frames. You’re not looking three frames into the past, just one, when that buffered frame doesn’t make the 16 ms deadline. In any average non-multiplayer FPS game you don’t even notice.

        • Entroper
        • 6 years ago

          This is a common misconception. There is a difference between doing triple buffering and lengthening the display queue (which is also commonly called a “buffer”, adding to the confusion). Triple buffering is a latency-[i<]reducing[/i<] technique. It employs two back buffers and switches between them (they are not strictly ordered, as they would be with a lengthened display queue).

          Say buffer F is on the screen, and the GPU is drawing into B1. If the GPU is working faster than the screen, then it can begin drawing in B2 before the buffer swap occurs. If it continues to work faster, eventually it will get far enough ahead that it actually completes two frames before the next buffer swap. In this case, the first frame that was completed is dropped, and the second one gets shown next, i.e., the one that was started closer to the current time (and with less latency).

          The maximum latency is 2 frames’ worth, the same as vanilla vsync, but the average latency is significantly less because the GPU can get ahead. The ONLY penalty is the extra buffer space in video memory, which is pretty insignificant with GBs of RAM on GPUs these days.
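          (A minimal sketch of that frame-selection logic, with made-up names rather than any real API, just to make the “newest completed frame wins” behavior concrete:)

```python
# Illustrative sketch of triple buffering vs. a display queue (hypothetical names).
completed = []           # frames finished in the two back buffers since the last swap

def gpu_finished_frame(frame_id):
    # With two back buffers, the GPU never stalls waiting for a swap;
    # it keeps rendering into whichever back buffer is currently free.
    completed.append(frame_id)

def on_vblank(front):
    # At the refresh, show the *newest* completed frame and drop older ones.
    # A lengthened display queue would show the *oldest* one instead,
    # which is where the extra latency comes from.
    newest = completed[-1] if completed else front
    completed.clear()
    return newest

front = 0
gpu_finished_frame(1)
gpu_finished_frame(2)        # GPU ran ahead; frame 1 is now stale
front = on_vblank(front)
print(front)                 # -> 2: the fresher frame is shown, frame 1 is dropped
```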

          • jessterman21
          • 6 years ago

          Well, you learn something new every day.

      • Pwnstar
      • 6 years ago

      Maybe you should.

        • auxy
        • 6 years ago

        I GOT BANNED FOR THIS

          • Bensam123
          • 6 years ago

          I love you auxy.

          • Pwnstar
          • 6 years ago

          You got banned? =(

            • Airmantharp
            • 6 years ago

            She does disappear periodically for lengths of time…

    • Krogoth
    • 6 years ago

    Interesting, but gimmicky technology. If you need slow-mo capture to notice a significant “difference”, then it is not worth worrying about unless you are one of those anal-retentive videophile types that must have perfect image fidelity no matter the cost.

    Tearing is over-exaggerated. You only notice it when you pan the screen around like an ADHD caffeine junkie or you are having some kind of seizure. Even in these cases, it is a blink-and-you’ll-miss-it type of deal.

      • alienstorexxx
      • 6 years ago

      I agree with the first paragraph. Not really impressed.
      Frame pacing problems are noticeable; this is not.

      Tearing really is a problem, but I don’t think an expensive, exclusive technology is the way to fix it.

      • Damage
      • 6 years ago

      Slow motion isn’t necessary to appreciate the benefits of G-Sync, but it is kind of necessary to capture those benefits and play them back with conventional, fixed-refresh video cameras and displays. That’s why I used slow motion videos.

      • wuzelwazel
      • 6 years ago

      So I hate to rain on your smarty-pants parade but the reason they used slow-mo capture was not to slow the footage down so that you could notice the difference. The reason they used slow-mo capture was because, well, your monitor sucks. You can’t watch a real-time video play back at a variable frame rate because your monitor is stuck on a steady refresh cycle… basically this is your fault, all your fault.

        • Krogoth
        • 6 years ago

        LCDs are the problem. They aren’t fast enough for fluid, ultra-high-end FPS. Speed has never been their strength. Crystal layers are slower at shifting between colors than the electron gun found in CRTs. TNs are the fastest LCD panels because there are fewer layers to work with. Backlight-strobing TN panels are still no match for a quality CRT.

        G-Sync (hardware) and vsync/buffering (software) only fix the syncing problems on the data front between the RAMDAC/TMDS on the video card and the monitor in question. The most common symptom is “screen tearing” when you pan around rapidly. They do nothing to address the shortcomings of the monitor’s technology.

          • Airmantharp
          • 6 years ago

          Why are you complaining about the problems that LCDs have in a discussion about technology that fixes a long-standing display problem?

          And if you look at it another way, LCDs (and OLEDs) actually allow us to fix this problem easily; with CRTs it would have been quite a bit more difficult!

          • mdrejhon
          • 6 years ago

          Wrong. Actually, backlight strobing is a light-output issue, not a clarity issue.

          The problem is the strobe lengths (1ms-2.5ms) are longer than a short-persistence CRT (<1ms). There is not enough light output in the backlight in order to shorten the strobes.

          Several newer TN LCDs are already able to transition pixels fast enough to be virtually finished before the next refresh cycle. Motion blur on a good strobe-backlight LCD is dictated by the strobe length, not by the LCD pixel transition speed. Pixel transition speed limits show up as a faint crosstalk effect (like 3D crosstalk, but as a chasing ghost). On the better LCDs, the crosstalk is not visible.

          Since I test these for Blur Busters, I have all four under one roof (several LightBoost monitors, an Eizo FG2421 with Turbo240, a GSYNC monitor NVIDIA sent, and a beta BENQ XL2720Z with BENQ Blur Reduction that BENQ sent).

          Using my oscilloscope, here are the strobe flash lengths:

          — LightBoost. Strobe flash length 1ms to 2.5ms. (usually 1.4ms to 2.4ms range)
          — EIZO Turbo240, found in FG2421. Strobe flash length 2.3ms
          — ULMB (Ultra Low Motion Blur) on GSYNC monitors. Strobe flash length 2.0ms.
          — BENQ Blur Reduction in Z-Series (XL2720Z, XL2411Z, XL2420Z). Strobe flash length 2.0ms

          So you can see, the strobe lengths mimic medium-persistence CRT phosphor.
          We need future strobe-backlight monitors that have shorter flash lengths (e.g. a 0.5ms strobe flash, once per refresh, and shorter) without the screen becoming dark. We also need the ability to strobe at lower refresh rates (e.g. strobing at 75Hz), since LightBoost is like a CRT forced to work only at 100-120Hz. Motion on impulse-driven displays always looks best at stroberate=refreshrate, and sometimes GPUs are not powerful enough to allow 120fps@120Hz.

          The side effect of slow LCD pixel transitions is already negligible on ULMB and the newer (1ms) LightBoost monitors, while it is more of an issue on the Eizo Turbo240. It shows up as a faint crosstalk between refreshes (<1% intensity — sometimes only 1 color off — much like greyscale 254 versus greyscale 255) since the black period now hides most of the LCD pixel transitions. This is lost in the visual clutter of game motion. Motion clarity noticeably improves with shorter strobe lengths (example: LightBoost 10% versus 50% versus 100% when viewing fast-panning motion — shorter strobe lengths are pretty noticeably clearer).

          My tests confirm that motion clarity is bottlenecked by light output, because the shorter you flash, the darker the picture becomes. CRT phosphors shine insanely bright (as much as ~5000cd/m2) for the short duration of illumination, often less than 1ms. LED backlights can’t shine that brightly, so strobed monitors compensate by using a persistence compromise (~2ms flash).

          Laboratory tests have already shown that there are no limits to motion clarity on strobe-backlight LCDs — once manufacturers build in brighter strobe backlights, they can flash the backlights more briefly, to mimic a shorter-persistence CRT. Tests confirm that motion clarity is proportional to persistence (1ms of strobe length translates to 1 pixel of motion blurring during 1000 pixels/second motion).
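          (A quick back-of-the-envelope sketch of that proportionality, my own illustration rather than anything from Blur Busters’ tooling: blur in pixels is roughly persistence in seconds times panning speed in pixels per second.)

```python
# Rough rule of thumb: eye-tracking motion blur (px) ≈ persistence (s) × pan speed (px/s).
def motion_blur_px(persistence_ms, pan_speed_px_per_s):
    return (persistence_ms / 1000.0) * pan_speed_px_per_s

print(motion_blur_px(1.0, 1000))    # ~1 px of blur, matching the 1ms example above
print(motion_blur_px(2.4, 1000))    # ~2.4 px for a longer LightBoost-style strobe
print(motion_blur_px(16.7, 1000))   # ~17 px for full-persistence 60Hz sample-and-hold
```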

          Millions of dollars of engineering went into LCDs that could finally finish refreshing before the next refresh, because of something called stereoscopic 3D. During 2010-2011, they did not do a good job, but during 2012-2013, with the advent of strobe backlights, 3D crosstalk finally fell almost below the human detectability threshold (on some of the best LCDs), and this automatically ended the era of “pixel transitions are the motion blur limiting factor”. So your talk is silly.

          The motion clarity limitation is no longer caused by the LCD pixel transition speed once most of the pixel transition is complete before the next refresh cycle. The LCD pixel transition can be slow, as long as it happens in total darkness and is practically finished (>99%) before the next strobe. When inter-refresh crosstalk is this faint, it gets lost in the noise floor of detailed imagery, while the motion clarity stands out (e.g. fine details during fast panning motion). The pixel transition speed became a moot issue once the inter-refresh crosstalk fell below a threshold — because motion clarity is dictated by strobe flash length. And strobe flash length can be shorter than pixel transitions. The strobe flash just has to be timed on the clearest, fully refreshed state of the LCD.

          I, and several others, have confirmed this. Do you have an oscilloscope, photodiode, and a high-speed camera like I do? Do you have all four under the same roof: Eizo Turbo240, BENQ XL2720Z Blur Reduction, LightBoost, and GSYNC’s ULMB — like I do? Thus, I call out your pixel-transition-speed limitation myth (at least for the newer LightBoost and ULMB models), since some of those models finally push the inter-refresh crosstalk below human perceptibility thresholds for nearly all combinations of GtG transitions, which means darn near completely clean refreshes for strobing.

          Given a sufficiently bright backlight, an LCD can be made to have far less motion blur than even short-persistence CRTs. LEDs can be flashed very fast, so it’s a matter of cost (engineering enough LED brightness) to achieve a sufficiently bright, ultra-low-persistence strobe backlight (say, 0.1ms for starters).

          But you are right: CRTs still produce great colors and unbeatable blacks. No argument.

            • Voldenuit
            • 6 years ago

            Voldenuit is impressed.

            • Krogoth
            • 6 years ago

            So you are saying that CRTs are faster, but a properly calibrated TN panel can be almost as good. Good to know. It is another matter for LCD manufacturers to go the extra mile on this.

            • jihadjoe
            • 6 years ago

            pwned.

            • TwoEars
            • 6 years ago

            That is a lot of words.

            Are you trying to score a job at the techreport?

            • Meadows
            • 6 years ago

            I guess you don’t know him. Display tech is his hobby and he has a website set up for it.

            • TwoEars
            • 6 years ago

            Interesting, I’ll have to re-read his response in detail then.

          • psuedonymous
          • 6 years ago

          [quote<]Crystal layers are slower at shifting between colors than electron gun found in CRTs[/quote<]

          There's all [i<]kinds[/i<] of things wrong with what you just said. CRT electron guns don't 'shift colours'; there are three separate electron beams in a CRT, one for each colour of phosphor (directed to the correct phosphor by the shadow mask or aperture grill).

          LCDs do not contain 'crystal lasers'. NO display technology contains crystal lasers*. LCDs aren't even an [i<]emissive display technology[/i<]; they work by [i<]blocking[/i<] light from a backlight.

          *a few exotic scanning-beam projectors (rather than fixed laser light source in place of a filtered broadband source or LED source, as in some microprojectors) exist, but those use diode lasers in order to switch at the required ultra-high frequencies, due to 'varying' beam intensity using PWM like in Plasma displays, rather than in an analog manner. If you wanted to use a 'crystal laser' (I presume you mean one using a discrete lasing rod) then you'd have to use a constant output beam and add a separate light modulator. Which would usually end up being a liquid crystal or a DMD.

            • Krogoth
            • 6 years ago

            Where do you get lasers from?

            “Shifting colors” describes the effect, not the exact process of how LCDs and CRTs display color on their panels. I understand quite well how they work. LCDs are *slower* at shifting between black and white than CRTs. However, LCDs are good enough for the vast majority of people (not many people have eyes trained enough to notice the difference) and the content that exists out there (ultra-high-FPS animation is a niche).

            LCDs replaced CRTs in the mainstream market, because customers are willing to overlook their drawbacks for their other more noticeable benefits like energy consumption, volume/mass and perfect screen geometry.

            • Meadows
            • 6 years ago

            He said layers, not lasers. Good grief, you.

            • sweatshopking
            • 6 years ago

            I love you guys.

      • MathMan
      • 6 years ago

      You’re very much entitled to dislike LCD technology and cheer for alternatives, but, right now, that’s the best technology we have in terms of price vs. performance. You are probably waiting for OLEDs, but they are simply not ready for mass production at a price that we’ve come to expect.
      As for CRTs: they are great to reduce motion blur because they only show a pixel for a short time before going back to black, but that very characteristic prevents them from varying the refresh rate all the time or from refreshing at a low rate because they will flicker.

      The funny part is that G-SYNC exploits the one particular characteristic that a CRT doesn’t have: an LCD holds on to its current value until it gets repainted. This makes it a natural fit for changing the refresh rate at will. The only reason they didn’t do this sooner is that technology makers were stuck in a CRT frame of mind.

      What remains, then, is whether or not it is a gimmick: you clearly seem to think it is. That’s fine, but that is a subjective judgement. However, it is undeniable from first principles that syncing a monitor to a variable-rate source is a smarter thing to do than the opposite. And from the reviews that I’ve seen, whether from professional reviewers or those who have modded their monitor, I’ve yet to see the first one that’s negative.

      You and I haven’t had the chance yet to try it, but maybe following first principles isn’t a gimmick but simply the right way of doing things?

      • Bensam123
      • 6 years ago

      This is how I play games – Ritalin and caffeine.

      • Airmantharp
      • 6 years ago

      You know, you should probably play more games. Try a Battlefield game, any of them.

      Tearing is constant, even running in a straight line, but it’s worst when flying where you really need the clarity. And no, it doesn’t take frantic panning to see tearing in any old FPS; ANY panning is going to make it stick out like a sore thumb.

        • Krogoth
        • 6 years ago

        It takes that much crazy panning to make it noticeable. That’s the entire point. The vast majority don’t care enough to make a fuss about it. It is the hardcore videophiles who are making most of the noise about it.

        I’ve played countless hours of fast-paced FPS games (the Quake franchise, TF2, the UT franchise). That is the first place where you will notice screen tearing and other artifacting. Unlike you, I don’t make a big deal out of it. It is a blink-and-you’ll-miss-it kind of thing.

        Frame-rate dips and general stuttering, on the other hand, are very noticeable and impact the gameplay experience.

          • derFunkenstein
          • 6 years ago

          Dude, just playing SC2 with vsync off, I see plenty of tearing. Just click around on the mini map a bit (like someone who plays at a fairly high level) and you’ll see the main viewport tear like balls.

            • Krogoth
            • 6 years ago

            Pretty much the same deal as someone who is doped up on stimulants/adrenaline playing any fast-paced FPS at a high level. Your point?

          • MathMan
          • 6 years ago

          If frame rate dips and general stuttering are very noticeable to you, then G-SYNC seems to be just the ticket for you. 🙂

            • Krogoth
            • 6 years ago

            G-Sync does little to remedy that. It only helps with syncing issues related to data being transferred from the RAMDAC/TMDS of your video card to the monitor. Screen tearing happens when that link is out of sync; it mostly occurs when the video card is sending frames faster than the monitor can handle.

            Frame-rate dips and stuttering problems are the realm of the processing power of your GPU and CPU combo.

          • wuzelwazel
          • 6 years ago

          So essentially you’re saying that people are used to a crappy experience and don’t know any better so they shouldn’t get any better. Is that about right?

            • Airmantharp
            • 6 years ago

            That sure sounds like where he’s coming from. I can see how that perspective makes it hard to see where I’m coming from. I’d pay the $100-$200 for an upgrade module for my ZR30W in a heartbeat!

            G-Sync at 2560×1600@60Hz would be glorious. I’d even make videos for you TR regulars!

            • Krogoth
            • 6 years ago

            No, it is not a “crappy” experience. It is called not being anal-retentive about some tiny crap that’s barely noticeable in most conditions. I can still find aliasing and other imperfections in texture and model rendering even when I crank up AA/AF. Do I make a huge fuss over it? Nope, I just live with it. Most of the world is in the same camp or simply doesn’t care. You play games to have fun, not to visit some kind of art gallery.

            It is the same kind of nonsense that audiophiles pull when they are convinced that you have to get special cabling and wooden knobs to get the true aural experience of your soundtracks. Anything else is just a waste of time.

            • MathMan
            • 6 years ago

            With the audio thing, you can use science and measurement to explain that it’s BS.

            With G-SYNC, you know from first principles that it’s the right way of doing things.

            With the audio thing, you have a few nutcases who claim to hear the difference. With G-SYNC, every single reviewer, professional or not, writes how much better it feels.

            Your analogy is broken in every way.

            You can object to it being proprietary and too expensive, and you have a point. But fundamentally it’s the right way to do it, no argument.

      • yammerpickle2
      • 6 years ago

      Panning around like an ADHD caffeine junkie is sometimes required to stay alive in first person shooting computer games.

      • indeego
      • 6 years ago

      I love you if only for the predictability of having an opinion that pisses TR members off enough to instill downvote rage. Just picturing the nerds angrily clicking the down-thumb in some display of “message-sending” is enough to raise a crooked smile on my bearded face.

      You have my utmost respect.

    • Star Brood
    • 6 years ago

    I’d like to see a side-by-side with 2D lightboost to see what, if any, benefits are to be had. I notice most of the people here are more about IPS than 120hz+, but the real comparison is how the tech fares against existing 120hz solutions. That is: is it worth almost twice the price.

    Nice article, but I think the comparison is showing the weakest monitors VS the best in terms of refresh rate.

      • wuzelwazel
      • 6 years ago

      The author used the same monitor for all comparisons. I suppose there is some talk about IPS in there but it didn’t strike me as an in-depth comparison of IPS vs. TN or anything. It was just a tangent.

      • Firestarter
      • 6 years ago

      Well, synchronization and tearing problems are reduced on 120Hz/144Hz displays; they’re present just as they are on 60Hz displays, but the problem is less visible. Personally, I think that is one of the biggest benefits of having a high-refresh-rate display, because it’s something you benefit from even when the framerates would suggest that you bought that 120Hz display for nothing.

      That said, the problem is still there, and G-Sync would eliminate it for all practical purposes. Going back to a monitor without G-Sync would probably feel just as jarring as going from a 120Hz monitor back to a 60Hz monitor, where the biggest difference that jumps out at you is how much more tearing you suddenly see on the 60Hz display.

      One thing I think you have to keep in mind is that our brains are very good at hiding imperfections in animation: the tearing that exists on 120Hz/144Hz displays is more or less filtered out of what you see, in my experience. The same can be said about 60Hz displays: although the switch from 120Hz to 60Hz is jarring, your brain will adjust to it after a while and the tearing won’t look nearly as bad as it did right after the switch. I’m sure the same is true when switching away from a G-Sync-equipped monitor.

      • mdrejhon
      • 6 years ago

      GSYNC strobe mode (LightBoost sequel) is called ULMB (Ultra Low Motion Blur).
      It’s a strobe backlight like LightBoost, but superior, and easily enabled via a button on the monitor.

      Posts on OCN and elsewhere, show that GSYNC monitors also include a LightBoost sequel called ULMB (Ultra Low Motion Blur). The winners of the GSYNC Upgrade giveaways, who got their GSYNC boards early, have scooped the news that NVIDIA wanted to announce during CES.

      As a rule of thumb, GSYNC and strobing does two different things:
      GSYNC — Eliminates stutters/tearing — Better for lower and variable framerates.
      Strobing — Eliminates motion blur — Better for high consistent framerates.

      You can’t combine both simultaneously…. Yet.

    • wierdo
    • 6 years ago

    Looks pretty cool, I hope an open standard comes out of this at some point.

    I have a video card that can support it, but I would hate to get a monitor with this feature and then find that it either limits my purchasing options or leaves a major feature unused after a future upgrade.

    We’ll see where this goes in the next few years, I’m optimistic it’ll sort itself out somehow. Maybe OLED will take off and make this a non-issue, who knows.

    • ronch
    • 6 years ago

    Being mostly a Radeon (and AMD) fan I hope this tech finds its way over to the red team. AMD made Mantle open to Nvidia so why is Nvidia making this tech proprietary? I think Mantle and G-sync are both great ideas that will significantly move the graphics industry forward so creating open standards should really be part of Nvidia’s To-Do list, which isn’t the case. Just look at PhysX and G-sync.

      • Deanjo
      • 6 years ago

      [quote<]AMD made Mantle open[/quote<] No they haven't. Still AMD only, still closed and not public and once again, AMD has not defined "open" anywhere.

        • alienstorexxx
        • 6 years ago

        Well, AMD has said something about moving Mantle to other platforms, and Mantle developers are hoping that Intel and Nvidia will support it, or whatever they have to do.

        At least they don’t make the kind of moves that Nvidia has made with GameWorks.

        [url<]http://www.extremetech.com/extreme/173511-nvidias-gameworks-program-usurps-power-from-developers-end-users-and-amd[/url<]

        Imagine a Dark Souls PC port with that. No one could ever play it on any card in existence ^^

        • Modivated1
        • 6 years ago

        Mantle is not proprietary; Nvidia can adopt it if they want to. There is nothing in Mantle’s design that specifies AMD hardware.

        G-Sync will be interesting when Nvidia does the same. I am not switching my hardware just so that I can buy one of Nvidia’s monitors.

          • MathMan
          • 6 years ago

          Not being tied to specific hardware doesn’t mean that it magically becomes non-proprietary.

          And I have yet to see the first quote from AMD where they state that they want to open up their proprietary API.

            • Antimatter
            • 6 years ago

            Game developers will not adopt a proprietary API that is supported by only a small percentage of gamers. AMD has no choice but to open up Mantle for it to have any possibility of success.

            • Modivated1
            • 6 years ago

            Seeing is believing

            Here is an article:
            [url<]http://wccftech.com/amd-mantle-api-require-gcn-work-nvidia-graphic-cards/#ixzz2ke7GbC1N[/url<]

            There is also a video that explains exactly how Mantle works and even gives a demonstration. In that video they explain that Mantle is not tied to one specific piece of hardware, as AMD will update their hardware in future generations and would have to make a new API if Mantle were hardware-specific. I will find the link to it and post it also.

            EDIT: Here's another article.
            [url<]http://www.usgamer.net/articles/amds-mantle-and-why-valve-might-want-to-worry[/url<]

            2nd EDIT: Nvidia compatibility
            [url<]http://www.youtube.com/watch?v=qbTY7Aq45IU[/url<]

            • MathMan
            • 6 years ago

            None of which contradicts my point.

            Does AMD say that they want to offer this to Nvidia (quite a bit different from their own future hardware, you know)? Do they say that they want to make it an open API?

            They don’t.

            In fact, when people started to think that they would, Dave Bauman chimed in on another forum and said that those statements were made by DICE, not AMD.

            AMD is not stupid.

            • Deanjo
            • 6 years ago

            Please link me to Mantle's open specification so that I may implement it. I can give you links to true “open” APIs. Here's one for example:

            [url<]http://www.opengl.org/[/url<]

            I'm sure all the open driver teams at freedesktop would be glad to have those so they can start developing the needed libraries to support Mantle. Please reply with the needed links as soon as possible, as we developers don't like to wait and would like to start incorporating features to support it as soon as possible, like what happened when the OpenCL specifications were released.

            • Voldenuit
            • 6 years ago

            Deanjo, if I could +2 you for this post, I would.

            Since I can’t, here’s a +1.

            • l33t-g4m3r
            • 6 years ago

            Mantle shmantle. This is about G-Sync, and you’re shilling up the comments with off-topic advertisements. Nvidia doesn’t even need Mantle, because they have superior drivers, and Maxwell will include an ARM core to do in hardware what AMD is doing with a proprietary API. The only useful new feature from AMD is TrueAudio, and we all know how much the PC gaming community cares about supporting 3D audio, other than myself.

            • Modivated1
            • 6 years ago

            This is relevant; the subject here is proprietary vs. non-proprietary, and G-Sync is the catalyst because it is proprietary technology.

            • Deanjo
            • 6 years ago

            As is Mantle. I’m still waiting for those open specs by the way… until you can provide those, Mantle is a closed proprietary solution.

            • l33t-g4m3r
            • 6 years ago

            While there might be some licensing issue about using G-Sync with non-Nvidia cards, I don’t really see how there is any real technical limitation keeping AMD from implementing it on their cards, other than adding G-Sync to the list of vsync options. It’s a monitor feature, not a video card feature.

            Even if AMD never officially supports G-Sync, I’m sure one of the third-party utility guys might be able to write a workaround. They did it for FXAA. G-Sync is no different, perhaps even easier. If this is the case, AMD users really have no legitimate complaint about the technology, other than being poor sports.

            • Bensam123
            • 6 years ago

            No one said it’s not proprietary. Developers can implement DirectX; they can also implement Mantle. Nvidia can implement Mantle just the same as AMD can.

            However, AMD can’t implement G-Sync as Nvidia will sue the pants off of them. Big difference.

            There seems to be a bit of a misunderstanding here, where people seem to think just because something is proprietary it’s tied specifically to a certain set of hardware. DirectX is proprietary and that didn’t stop Nvidia from implementing it. Just the same as they can do with Mantle.

          • Klimax
          • 6 years ago

          Same way as AMD could adopt CUDA.

          • Deanjo
          • 6 years ago

          In that regard, PhysX is open too. AMD is free to license it from Nvidia if they want to.

            • Modivated1
            • 6 years ago

            When you have to pay the company that established the technology, then by its very definition it is NOT open. Neither Intel nor Nvidia has to pay royalties to use Mantle. Thus it is an open offer. They just have to do all the legwork to adapt to it so that they can get the advantages of Mantle’s design.

            Do not forget that this is a business; Intel and Nvidia hoard their custom tech all the time, making it proprietary. AMD has brought a number of works to the table, tessellation and TressFX to name a few. Now Mantle is on the table, but rivals have to do some work to make it benefit them. That’s their problem; no one is going to hand you the advantage on a silver platter.

            Nvidia took the advantage with tessellation and has the better GPU on that tech. If they are worth what all the green team’s cheerleaders say they are, then they can do it again, and they would have nothing to fear from offering open PhysX and G-Sync.

            It matters not; all Mantle has to do now is perform as well as the hype. It’s got enough games to showcase its advantages, and they have won favor with developers by answering their plea when no one else would. If people are impressed and buy into Mantle, then it will become the standard. AMD will just have to get around to building their own physics chip and G-Sync-type tech.

            If you build it, they will come!

            • Deanjo
            • 6 years ago

            [quote<]When you have to pay the company who established the technology then that by it's very definition is NOT open. Intel nor Nvidia has to pay royalties to use Mantle. [/quote<] Monitor vendors aren't paying anything to nVidia.

            • MathMan
            • 6 years ago

            In which fairy tale world is AMD giving away Mantle for free to Nvidia and Intel?

            Really, do tell, because right now I have this impression that you’re just making things up for your convenience.

            • Bensam123
            • 6 years ago

            They can’t implement it in hardware. It’s locked specifically to Nvidia hardware. They can’t license it.

            Furthermore, software developers licensing PhysX != another hardware manufacturer being able to license and accelerate it on their hardware.

            • Deanjo
            • 6 years ago

            FYI, at one time Nvidia was open to licensing PhysX to AMD; AMD declined. Get your facts straight.

            [url<]http://www.bit-tech.net/custompc/news/602205/nvidia-offers-physx-support-to-amd--ati.html[/url<]

            Secondly, AMD has said themselves that Mantle only works on GCN hardware right now. That is no different than G-Sync being available on select Nvidia hardware. Getting either one running on the other vendor's hardware isn't really any different in terms of what has to be done; both would require software adaptations.

            • Modivated1
            • 6 years ago

            OK, well, if a G-Sync monitor will work on AMD hardware and AMD adapts to it, then regardless of who produced the tech, I will buy it. If I have to convert my system to Nvidia-specific hardware, then I will pass on G-Sync.

            • Meadows
            • 6 years ago

            Your loss. 🙂

            • Modivated1
            • 6 years ago

            Let's see: better fidelity, smoother rendering, more AI applied to moving objects on screen, dramatically less CPU load, full utilization of traditionally untapped GPU power and features, AND better FPS = Mantle.

            No frame tearing = G-Sync.
            More cost to convert the system.

            My loss? I don't think so. Don't believe my claims? Watch this demo.

            [url<]http://www.youtube.com/watch?v=QIWyf8Hyjbg[/url<]

            • Meadows
            • 6 years ago

            G-sync is here today. Show me some [i<][b<]released[/b<][/i<] Mantle games that fulfill all your claims.

            • Modivated1
            • 6 years ago

            G-Sync is here today? Really? Where CAN I BUY A MONITOR? I see that we are commenting on a first-look introduction to G-Sync; I don't see any monitor availability. In fact, it looks as if Mantle will be making its public debut this month, while G-Sync will arrive in another 4 to 6 months.

            Mantle will be released this month for Battlefield 4.

            G-Sync? Well, here is an article printed today about G-Sync's release:

            [url<]http://www.incgamers.com/2014/01/nvidia-says-g-sync-monitors-will-available-mid-2014[/url<]

            Without even clicking the link you can read that G-Sync is NOT here today (MID 2014)! So I guess how good these things are will be known very soon. But until then, folks, please try not to talk out the side of your neck with faulty claims.

            • Meadows
            • 6 years ago

            The article clearly states which monitor is available today for purchase. Look it up yourself.

            • Modivated1
            • 6 years ago

            NO NEED, AMD has answered the call

            [url<]http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014[/url<]

            I hate to be a cheerleader (which I have totally been on this article), but like I said, no loss on my part. "FREESYNC, BABY"!!

            • Modivated1
            • 6 years ago

            No monitor is available; you can buy mod kits to make your monitor G-Sync compatible, but actual monitors will be available in 4 to 6 months.

            • MathMan
            • 6 years ago

            <never mind>

          • Meadows
          • 6 years ago

          Of course it’s proprietary.

        • Bensam123
        • 6 years ago

        You can develop for Mantle. Just because it's proprietary doesn't mean other people can't develop for it.

        Case in point: AMD tries to implement G-Sync, Nvidia sues the pants off of AMD. Nvidia tries to implement Mantle, AMD welcomes them with open arms.

    • Srsly_Bro
    • 6 years ago

    Here's an opportunity for an AMD rant. I'm tired of their driver issues, which I've never experienced with several Nvidia cars. I've owned 4 Nvidia cards and 2 AMD (3 if you count an APU), and the driver issues I experience are making me switch to the green side. Where u at, Maxwell?

      • ronch
      • 6 years ago

      [quote<]Nvidia cars[/quote<] I didn't know Nvidia makes cars...

        • Meadows
        • 6 years ago

        And finally the letters make sense.

      • BlackStar
      • 6 years ago

      I’ve never had Firefox crash with AMD. I’ve had it crash *multiple* times with Nvidia when Direct2D acceleration is enabled.

      Nvidia is focusing all their energy on games but their cards suck for general desktop usage.

      • clone
      • 6 years ago

      a discussion about Nvidia’s G-sync leads to an AMD rant?……….

      Here, I'll offer a counter to the absurdity… I recently sold my HD 7850 and got two Nvidia GTXs, and they've been surprisingly bad.

      A smattering of blue screens… blue screens!! (I hadn't had one in two years.) A dozen-plus driver resets while surfing (so often they'd become a part of the experience), and the occasional lockup that forces a restart (2 or 3 a week depending on amount of use). It was terrible until I did a driver rollback to 314.22. The system's been reasonably solid since the rollback, but I've got nothing good to say about the experience or the company in general at this point.

      P.S. When I go into sleep mode my monitor won't shut down… again. I had this problem with the previous GTX 460 until I bought the HD 7850, but now it's returned with the return to Nvidia.

    • Prestige Worldwide
    • 6 years ago

    Shut up and take my money!!!!!!!!!!!

      • Jigar
      • 6 years ago

      Wrong choice of words…

      • oldDummy
      • 6 years ago

      Puff….yawn……just leave it on the dresser.

    • Pville_Piper
    • 6 years ago

    Outstanding article, as usual for TR. I can't wait to get my hands on one of these!

    • jihadjoe
    • 6 years ago

    How hard is it to install the G-Sync upgrade kit and do you think an enterprising user might be able to adapt it for use with one of those overdriveable Korean 27″ IPS monitors?

      • wingless
      • 6 years ago

      Nvidia provided a REALLY detailed instruction book. You just need basic tools like screwdrivers and needle-nose pliers. It took me ~45 minutes because I was super careful. I think the issue with installing this in other panels would be the cables that connect it to the backlight and panel. I'm not sure if they are industry-standard connectors.

        • ronch
        • 6 years ago

        In its current state, G-Sync is strictly in geek-mod territory. No normal person is gonna pop his monitor open to do the surgery required, instruction book or not. Heck, I know someone who would probably faint at the sight of circuit boards and solder!

    • JosiahBradley
    • 6 years ago

    If you are not maintaining 144 fps, then the vsync result at 144Hz will jump between 144, 72, and 36 fps. This is why G-Sync looks smoother than vsync in the videos. However, if you could test on an older game that could hold a minimum frame rate of 144 fps, the videos would look much more similar; indeed, they'd look identical, because the monitor's refresh rate is still the upper bound.
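    (As a rough sketch of that quantization, illustrative numbers only: with vsync, a frame that misses a refresh waits for the next one, so the delivered rate snaps to the refresh rate divided by a whole number.)

```python
import math

# Toy model of double-buffered vsync quantization (illustrative only).
REFRESH_HZ = 144
REFRESH_MS = 1000.0 / REFRESH_HZ           # ~6.94 ms per refresh

def vsync_effective_fps(render_ms):
    # A frame that misses a refresh waits for the next one, so the delivered
    # rate snaps to refresh/1, refresh/2, refresh/3, ... (144, 72, 48, 36, ...).
    refreshes_waited = max(1, math.ceil(render_ms / REFRESH_MS))
    return REFRESH_HZ / refreshes_waited

print(vsync_effective_fps(6.0))    # 144.0: the frame was ready within one refresh
print(vsync_effective_fps(8.0))    # 72.0: just missed, so it waits a second refresh
print(vsync_effective_fps(15.0))   # 48.0: the frame spans three refresh intervals
```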

    With that said it is a cool technology. One I’ll never buy until it’s an open standard however (included in DisplayPort and VESA).

    I do however miss >60Hz refreshes like we had with CRTs and would love to get that back.

      • Firestarter
      • 6 years ago

        [quote<]However, if you could test on an older game that could hold a minimum frame rate of 144 fps[/quote<]

        Not very relevant for most of us, right?

        [quote<]indeed, they'd look identical[/quote<]

        Close, maybe, but G-Sync would be better at compensating for variation in how long it took for a frame to render. This leads to better animation fluidity than with 144fps vsync.

        • JosiahBradley
        • 6 years ago

        I was simply answering the question of why vsync at 144Hz still has jitter. G-Sync only works when the fps is less than the refresh rate; G-Sync is equal to vsync when the fps is higher than the capability of the display. For those of us with 60Hz monitors and capable hardware, we may never dip below 60 fps even in modern games, so the extra cost and lock-in may not benefit us much. The story changes, of course, when the panel can refresh super fast, giving you more fluidity, but then the cost soars, having to buy new monitor(s) and new video cards for something we have been used to for 20+ years of gaming.

        I'm all for the technology here; I just don't care for something I can't buy or utilize due to lock-in.

      • Klimax
      • 6 years ago

      How far back? Chaser, Doom 3,…
      (Quite a few games won't even allow that high an FPS.)
