Adaptive-Sync added to DisplayPort spec

PC gaming animation may soon become more fluid than ever, thanks to a development just announced by the folks at the VESA display standards organization. VESA has officially added a feature called Adaptive-Sync to the DisplayPort 1.2a specification, which means that a G-Sync-style adaptive refresh mechanism could be built into nearly every new desktop monitor in the coming months and years.

The press release explains exactly what’s at stake:

NEWARK, CA (12 May 2014) — The Video Electronics Standards Association (VESA®) today announced the addition of Adaptive-Sync to its popular DisplayPort 1.2a video interface standard. This technology delivers several important capabilities to computer users: Adaptive-Sync provides smoother, tear-free images for gaming and judder-free video playback. It also significantly reduces power consumption for static desktop content and low frame rate video.

Computer monitors normally refresh their displays at a fixed frame rate. In gaming applications, a computer’s CPU or GPU output frame rate will vary according to the rendering complexity of the image. If a display’s refresh rate and a computer’s render rate are not synchronized, visual artifacts—tearing or stuttering—can be seen by the user. DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience.

We saw a demo from AMD of such technology in use on a laptop back at CES, which AMD cheekily dubbed "FreeSync," since the tech didn’t require a cost-adding G-Sync module in order to work. That feature, however, only exists in select laptop displays, where adaptive sync exists primarily for power-saving reasons. With today’s announcement, variable refresh tech becomes part of the spec for external displays, as well, and gets its new name: 

Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.

The approval of this spec is only one step in a long process before we can expect to see desktop monitors that support Adaptive-Sync on store shelves. The makers of display scaler and control chips still have work to do, and those companies have been rather sluggish in developing ASICs capable of supporting 4K resolutions at 60Hz. (That’s why a lot of the early 4K displays use dual tiles, which is way less than optimal.) 

Still, I think this announcement has to count as a victory for AMD’s "FreeSync" initiative, its counter to Nvidia’s G-Sync push. In fact, the VESA release quotes an AMD engineer espousing the benefits of Adaptive-Sync: 

“DisplayPort Adaptive-Sync enables a new approach in display refresh technology,” said Syed Athar Hussain, Display Domain Architect, AMD and VESA Board Vice Chairman. “Instead of updating a monitor at a constant rate, Adaptive-Sync enables technologies that match the display update rate to the user’s content, enabling power efficient transport over the display link and a fluid, low-latency visual experience.”

The fact that a spec update happened is also a bit of a blow to Nvidia, simply because the firm’s G-Sync guru, Tom Petersen, told us at CES that he didn’t think an update to DisplayPort was needed for variable refresh. Evidently, VESA was persuaded otherwise.

The addition of Adaptive-Sync does mean Nvidia has achieved its stated goal of pushing the industry forward on this front. Yet it also means Nvidia’s window of exclusivity, where only G-Sync-compatible displays combined with GeForce graphics cards will offer variable refresh tech, could be fairly narrow. That window had already shrunk somewhat with rumored last-minute changes to the G-Sync module. Issues with the module are apparently responsible, at least in part, for the fact that G-Sync-compatible monitors haven’t yet reached the market as anticipated.

Variable refresh technologies like Adaptive-Sync offer potential benefits beyond added smoothness in gaming. For video playback, display refresh rates could be lowered to sync with low-frame-rate video sources like 24 FPS movies, eliminating the need for inverse telecine conversion. For less intensive desktop workloads where the screen contents are often static, variable refresh tech could allow "the display refresh rate to be reduced seamlessly, lowering system power and extending battery life," according to VESA.
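To put a rough number on the video-playback case (my own back-of-the-envelope illustration, not something from VESA's release): on a fixed 60Hz panel, 24 FPS film has to be mapped onto refresh slots via 3:2 pulldown, so alternating frames are held for unequal times, while a variable refresh rate lets the panel simply track the source.

```python
# Hypothetical illustration: frame hold times for 24 FPS film on a fixed 60 Hz
# display (3:2 pulldown) versus an adaptive refresh that tracks the source rate.
FPS_SOURCE = 24
HZ_FIXED = 60

pulldown_pattern = [2, 3]  # refresh cycles per source frame, alternating (3:2 pulldown)
hold_times_fixed = [n * 1000 / HZ_FIXED for n in pulldown_pattern]  # milliseconds
hold_time_adaptive = 1000 / FPS_SOURCE                              # milliseconds

print(f"Fixed 60 Hz:   frames held {hold_times_fixed[0]:.1f} ms / {hold_times_fixed[1]:.1f} ms (judder)")
print(f"Adaptive sync: every frame held {hold_time_adaptive:.1f} ms (uniform)")
```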

Update: AMD has provided us with a series of questions and answers that clarifies the relationship between its own Project FreeSync initiative and VESA’s Adaptive-Sync feature, as well as offering some insights about when we can expect to see Adaptive-Sync-ready displays. Here are the juiciest bits:

Q: How are DisplayPort™ Adaptive-Sync and Project FreeSync different?

A: DisplayPort™ Adaptive-Sync is an ingredient DisplayPort™ feature that enables real-time adjustment of monitor refresh rates required by technologies like Project FreeSync. Project FreeSync is a unique AMD hardware/software solution that utilizes DisplayPort™ Adaptive-Sync protocols to enable user-facing benefits: smooth, tearing-free and low-latency gameplay and video.

Q: When can I buy a monitor compatible with Project FreeSync?

A: AMD has undertaken every necessary effort to enable Project FreeSync in the display ecosystem. Monitor vendors are now integrating the DisplayPort™ Adaptive-Sync specification and productizing compatible displays. AMD is working closely with these vendors to bring products to market, and we expect compatible monitors within 6-12 months.

Q: What is the supported range of refresh rates with FreeSync and DisplayPort™ Adaptive-Sync?

A: AMD Radeon™ graphics cards will support a wide variety of dynamic refresh ranges with Project FreeSync. Using DisplayPort™ Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.

Q: What AMD Radeon™ GPUs are compatible with Project FreeSync?

A: The first discrete GPUs compatible with Project FreeSync are the AMD Radeon™ R9 290X, R9 290, R7 260X and R7 260 graphics cards. Project FreeSync is also compatible with AMD APUs codenamed “Kabini,” “Temash,” “Beema,” and “Mullins.” All compatible products must be connected via DisplayPort™ to a display that supports DisplayPort™ Adaptive-Sync.

I think there are a few big-ticket takeaways from AMD’s statement.

First, there’s the time frame of six to 12 months before Adaptive-Sync displays become available. That’s pretty wide latitude, and we’re talking about the first wave of compatible monitors. A year could pass before Adaptive-Sync displays ship to consumers (or longer, if that estimate is off). Assuming Nvidia and its partners get their G-Sync hardware working soon, the window for G-Sync exclusivity could still be fairly broad.

Second, I’m happy to see that the ranges of refresh rates possible with Adaptive-Sync displays are so wide and extend to relatively low numbers. The early G-Sync hardware we tested didn’t go quite as low, and Nvidia was clearly missing some of the benefit of this tech in performance-limited scenarios, where the user can feel its impact the most. I do expect production G-Sync modules to lower the refresh limit, too, although that’s not something I’ve confirmed.
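To illustrate why that low end matters, here is a minimal sketch (my own, under the assumption that a driver simply clamps to whatever range the display reports; neither AMD nor VESA has published driver logic here) of picking a refresh interval from a reported range:

```python
# Hypothetical sketch: choose a refresh interval given a display-reported
# refresh range (e.g. 21-144 Hz) and the GPU's frame time for the next frame.

def next_refresh_interval_ms(frame_time_ms, min_hz, max_hz):
    fastest = 1000.0 / max_hz  # shortest interval the panel allows
    slowest = 1000.0 / min_hz  # longest interval the panel allows
    if frame_time_ms < fastest:
        return fastest         # GPU outruns the panel: cap at max_hz
    if frame_time_ms > slowest:
        return slowest         # below the panel's minimum rate: repeat frames
    return frame_time_ms       # in range: refresh exactly when the frame is ready

print(next_refresh_interval_ms(30.0, 21, 144))   # 30.0 -> a 33 FPS frame is shown on completion
print(next_refresh_interval_ms(100.0, 9, 60))    # 100.0 -> a 10 FPS slideshow tracked directly
print(next_refresh_interval_ms(100.0, 21, 144))  # ~47.6 -> frame repeats needed below 21 Hz
```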

Finally, it’s clear VESA’s effort and AMD’s Project FreeSync are not to be confused. Nvidia, Intel, and other GPU makers participate in standards bodies like VESA and are likely to support DisplayPort 1.2a with Adaptive-Sync, as well.

Comments closed
    • HisDivineOrder
    • 6 years ago

    My God.

    This has taken a lot longer than it should have! Seriously. Plus, when they were spec’ing a new spec for encoding video signals to be transmitted a month or so ago, I kept wondering why they weren’t addressing the obvious way of helping to reduce bandwidth requirements… by reducing our reliance on bandwidth to basically send a, “Repeat that image” over and over on our desktops while they sit displaying a start bar, a background, and some icons for hours on end. Sure would save bandwidth if it could just send a, “Yeah, you know that stuff I told you to send a minute ago? Keep doing that… till I tell you otherwise.”

    Bam, insta-bandwidth savings. That’d have to save power, too.

    So yeah. Good thing for everyone. Well, everyone except nVidia maybe.

    Sure wish they could find a way to give me a Display Adapter peripheral for my dual-DVI 2560×1600 monitor though. Seems wasteful to dump it for something that should be able to be done through a converter box to give me better refresh rates (even if it introduces a smidge of lag). I’m already enduring lag anyway with conventional vsync (or adaptive vsync).

    I think that “Adaptive sync” naming may be AMD’s ultimate jab at nVidia since the latter has its own “adaptive v-sync” that will now be eternally confused with it.

    • moose17145
    • 6 years ago

    I am just happy my R9 290 will support variable refresh rates when the monitors become available. Will be nice if they do in fact become available to the consumer within a year. For me it means I will not have to update my video card just to get variable refresh rates. If the monitors are not available until after a year, then I will have to see what is happening in the video card world at that time and see if a new card would be worth it.

    Also, I wonder if you have a device that already has display port 1.2 if you can just flash the firmware and get DP 1.2a capabilities, or if you are going to have to buy a new device with 1.2a ports.

    • Ninjitsu
    • 6 years ago

    From [url=http://www.anandtech.com/show/8008/vesa-adds-adaptivesync-to-displayport-12a-standard-variable-refresh-monitors-move-forward<]AnandTech[/url<] [quote<] Finally, we also had a brief chat with NVIDIA about whether they would support Adaptive-Sync on current generation hardware. NVIDIA tells us that they can’t comment at this time since there aren’t any Adaptive-Sync displays available. It’s entirely possible this is just NVIDIA being coy, however like all device vendors they do have to pass the VESA’s compliance tests. So if nothing else NVIDIA’s “no comment” is technically correct: until they pass that test they are limited in what they can say about being Adaptive-Sync compliant. [/quote<]

    • Laykun
    • 6 years ago

    nvidia : “Hmm, we need to make g-sync relevant if we still want to lock customers into our eco-system”

    6 months later

    nvidia : “G-sync now processes PhysX commands directly in the screen and is no longer locked into the internal tick of the game engine!”

      • alientorni
      • 6 years ago

      LOL!
      that there’s a lot of nvidia fanboys here but this was a very good one…

    • Klimax
    • 6 years ago

    Seems there are some interesting differences between tech, so it might be fun with comparisons…

    • Ninjitsu
    • 6 years ago

    Well, this is good news, and expected since the FreeSync announcement. Sadly I can’t take advantage of it (just got a new monitor a few months ago) but with min 60 fps @1080p with a midrange card slowly becoming possible, that’s not too much of an issue.

    Hopefully the next GPUs from both AMD and Nvidia will support DP1.2a or 1.3, DX12 is probably a given. Finally, LCDs can leave their CRT legacy behind (though i wish response times were equally low!).

    I think Adaptive-Sync is also necessary for 4K, because most GPUs today still can’t hit 60 fps at high detail settings on that resolution.

    • Anomymous Gerbil
    • 6 years ago

    Does anyone know why only restricted ranges of refresh rates are allowed (e.g. 36-240Hz, 21-144Hz etc), rather than just zero up to the maximum allowed by the monitor?

      • Meadows
      • 6 years ago

      I believe there are two reasons:

      1. Monitors don’t support infinitely low refresh rates,
      2. If your framerate is lower than the minimum refresh rate, then the driver can just use a multiple of your framerate and be fine that way. E.g. if your game runs at 15 fps, the driver can just set a refresh rate of 45 Hz and still never skip a beat.

      At least I hope that’s how it works; otherwise the designers have been stupid.
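A sketch of the multiple-of-framerate idea in the comment above (purely illustrative; no shipping driver is known to work exactly this way): pick the smallest integer multiple of the content frame rate that falls inside the panel's supported range.

```python
# Illustrative only: find a refresh rate that is an integer multiple of the
# content frame rate and lies within the display's supported refresh range.

def refresh_for_low_fps(fps, min_hz, max_hz):
    multiple = 1
    while fps * multiple < min_hz:
        multiple += 1
    refresh = fps * multiple
    return refresh if refresh <= max_hz else None  # None: no clean multiple fits

print(refresh_for_low_fps(15, 36, 240))  # 45 -> each frame shown for 3 refreshes
print(refresh_for_low_fps(24, 36, 240))  # 48 -> each frame shown for 2 refreshes
```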

      • Prion
      • 6 years ago

      It’s hard enough to avoid color shifts on certain types of LCD panels when switching between fixed refresh rates, I’d hate to see how bad the cheap implementations are going to get it when trying to adjust for it in real-time.

    • mark84
    • 6 years ago

    If the R9 290X will be supporting this, by extension does that mean the R9 295X2 is compatible as well?

    • joyzbuzz
    • 6 years ago

    “The addition of Adaptive-Sync does mean Nvidia has achieved its stated goal of pushing the industry forward on this front.”

    Wrong. AMD expressed surprise when Nvidia introduced G-Sync because they said the addition of Adaptive Sync to the DisplayPort standard was already a done deal and would be announced within a matter of months. Nvidia did SQUAT to ‘push the industry forward on this front’; it was purely a PR play to make it LOOK like they were a leader in this.

    “A year could pass before Adaptive-Sync displays ship to consumers (or longer if that estimate is off.) Assuming Nvidia and its partners get their G-Sync hardware working soon, the window for G-Sync exclusivity could still be fairly broad.”

    Coulda, woulda, shoulda. The technology already exists, and chances are it will appear in displays sooner rather than later; the monitor manufacturers are certainly aware there will be a sizeable market for the new DisplayPort standard, a FAR bigger market than the much more expensive and vendor-limited G-Sync monitors. The ‘window’ between Adaptive Sync monitors and a working G-Sync could just as well be a couple of months or less. Why would monitor manufacturers even bother at that point?

      • mutantmagnet
      • 6 years ago

      Surprised this post got multiple thumbs up. If it was a done deal AMD wouldn’t have needed to do an entire initiative to convince VESA to make these changes.

      No one knows how long it will take for 1.2a to be adopted. Heck, G-Sync has been rolling out even slower than expected.

    • Forge
    • 6 years ago

    My next big monitor purchase is slowly coming together! I need 4K, a quality panel with good viewing angles, and 120Hz support, and now Adaptive sync. That ought to be worth the thick end of a grand!

      • Airmantharp
      • 6 years ago

      I’ll add 50%-70% more for great color and uniformity, in a ~28″ form factor with an available VESA mount.

      Needing a single monitor for gaming AND for photography is one of the most stressful starting points for the pocketbook!

        • Ninjitsu
        • 6 years ago

        Why not just get two? A TN or similar cheap display for gaming, and a more expensive/accurate one for photography.

          • Laykun
          • 6 years ago

          Some people like to have their games not look like a pile of washed out turds.

            • Krogoth
            • 6 years ago

            >implying games are proper source material to demonstrate the differences in color gamut.

            • Airmantharp
            • 6 years ago

            Not for color gamut, no, but real dynamic range and real contrast get pushed pretty hard.

            • Laykun
            • 6 years ago

            >implying that source material for games needs to be colour correct for them to be pleasing to the eye.

            • Laykun
            • 6 years ago

            I play games on a wide-gamut IPS and it looks lovely. It might not look ACCURATE but I personally feel it looks great. Visuals in video games are SUBJECTIVE.

          • Zizy
          • 6 years ago

          Why get two, when a single OLED screen should be able to do it all better than all the LCD techs combined?

            • Laykun
            • 6 years ago

            And last 2 years till burn-in.

            • Airmantharp
            • 6 years ago

            Still waiting on those pro-grade OLED displays at near-consumer prices. Not saying that they won’t exist, and I agree that aside from life expectancy that they’ll probably do the job better than LCDs ever could, but they’re not here yet :).

      • moose17145
      • 6 years ago

      I tend to agree. A decent IPS or VA panel, 60 Hz refresh (minimum), adaptive/free-sync, and 4K. Because of 4K I will stick with a 60Hz minimum refresh, since the bandwidth isn’t there yet for a true 120Hz refresh (unless you teamed up a pair of DisplayPorts, maybe…). And the monitor has to have a single large tile. None of those dual 2K virtual display hack jobs that cause endless problems. That is what I want for my next monitor. And I AM in the market for a new monitor. But with the way things are going, I will hold off on that purchase for a bit longer till I can get all those things.

      • Sabresiberian
      • 6 years ago

      Add a graphics card that will support such a display to the cost. 🙂

      • floodo1
      • 6 years ago

      you forgot one: it needs to cost < $300
      hahahhahahhahah

    • Krogoth
    • 6 years ago

    I’ll give it at least two years before this spec gets implemented in most monitors, and that’s being optimistic. Just look how long it took for DisplayPort monitors to start appearing on the market en masse after the interface got finalized.

    Just don’t get too excited over this.

      • Deanjo
      • 6 years ago

      Even then DisplayPort seems to be implemented only in the higher end monitors.

        • Firestarter
        • 6 years ago

        Still, anyone looking to cater to the gaming market (higher end TN monitors) would jump on the opportunity to market something that increases responsiveness and reduces lag. And this time it would even be warranted!

      • Ninjitsu
      • 6 years ago

      Maybe this will help push marketing? Especially in conjunction with 4K…

      So far, for a regular 1080p monitor, there was no real “killer feature” to DP (unless there was, i wouldn’t know)…but now they have something to shout about.

      And if the new DX12 cards all have this as well, then they’ll shout about it too.

      • alientorni
      • 6 years ago

      I think that demand could push this technology forward. The current DisplayPort doesn’t have that many improvements over regular digital inputs.

    • ALiLPinkMonster
    • 6 years ago

    Among the best news I’ve heard so far this year.

    • Bensam123
    • 6 years ago

    And this is why Nvidia is going to lose out if they’re intent on locking all their bells and whistles down. They effectively made a product that has been obsoleted in less than a couple of months, and no one is going to go back to it if it’s simply included in a specification everyone has access to.

    I don’t think 6-12 months is a big window. Considering how often people buy monitors, they’re more than likely to put off a monitor purchase to get optimal value. It’s not like they lose out on anything, since they weren’t able to use it in the first place.

    I wonder if we’ll see ‘high end’ monitors with firmware upgrades to enable this technology, or ‘upgrade’ scalers for monitors. I own a VG248QE, and g-sync modules are available. Taking that a bit further, it’s too bad monitors aren’t more upgradeable when it comes to things like this, although that usually depends a lot on the display.

    Interestingly, how does this ‘reduce latency’? I definitely can see the benefits as far as screen tearing goes, because it’s a type of v-sync, but how does it reduce latency?

      • Voldenuit
      • 6 years ago

      [quote<]Interestingly, how does this 'reduce latency'? I definitely can see the benefits as far as screen tearing goes, because it's a type of v-sync, but how does it reduce latency?[/quote<] By delaying the screen refresh until the GPU is finished with the frame, the display can show a frame as soon as it is done, instead of having to repeat the last frame (16 ms latency) and then push the new one out at the next screen refresh (another 16 ms latency, for a total of 33 ms).
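A worked version of that arithmetic, using a deliberately simplified model (it ignores scan-out time and any pipeline buffering): with fixed-rate vsync, a frame that just misses a refresh waits for the next boundary, while with adaptive sync the refresh happens when the frame is ready.

```python
import math

# Simplified model of the wait between "frame finished" and "frame shown"
# on a fixed 60 Hz display with vsync. Adaptive sync makes this wait ~0.
REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh period

def vsync_wait_ms(finish_time_ms):
    """Time the finished frame waits for the next fixed refresh boundary."""
    next_refresh = math.ceil(finish_time_ms / REFRESH_MS) * REFRESH_MS
    return next_refresh - finish_time_ms

# A frame that finishes just after a refresh boundary waits almost a full period:
print(f"{vsync_wait_ms(16.8):.1f} ms extra")  # ~16.5 ms; with adaptive sync, ~0 ms
```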

        • Airmantharp
        • 6 years ago

        The only nit to pick here is that G-Sync provides for an on-monitor buffer, while Adaptive-Sync just rips all of that stuff out or bypasses it altogether.

        I’d be inclined to believe that G-Sync might incur a greater input lag penalty due to extra buffering in order to get around the limitations of the current implementation of DP.

        • Bensam123
        • 6 years ago

        So you’re taking into account a ‘less optimal’ refresh if it’s pushed out right when the next frame becomes available? This wouldn’t be that meaningful with really high refresh rate monitors, but I could see it having an impact with 60Hz displays.

          • Voldenuit
          • 6 years ago

          Presumably the GPU drivers will have some form of algorithm to space frame buffer updates within a certain tolerance of the display update cycle. It shouldn’t be *too* different from what they already do with Vsync/Adaptive Vsync, only with a moving target.

          Since GSync/Freesync/ASync work best at lower framerates, it’s not like the GPU will be pushing frames faster than the display can keep up, more likely the other way around.
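One way such an algorithm could look (purely a sketch of the idea, not AMD's actual driver logic): predict the next frame time from a moving average of recent frame times, then clamp the resulting VBLANK interval to the panel's supported range.

```python
from collections import deque

# Hypothetical frame-pacing sketch: predict the next VBLANK interval from a
# moving average of recent frame times, clamped to the display's refresh range.
class VblankPredictor:
    def __init__(self, min_hz=30, max_hz=144, window=8):
        self.min_interval = 1000.0 / max_hz  # shortest allowed interval (ms)
        self.max_interval = 1000.0 / min_hz  # longest allowed interval (ms)
        self.history = deque(maxlen=window)

    def record(self, frame_time_ms):
        self.history.append(frame_time_ms)

    def next_interval_ms(self):
        if not self.history:
            return self.min_interval
        predicted = sum(self.history) / len(self.history)
        return min(max(predicted, self.min_interval), self.max_interval)

pacer = VblankPredictor()
for ft in (18.0, 21.0, 19.5, 25.0):  # wobbly frame times from the GPU, in ms
    pacer.record(ft)
print(f"{pacer.next_interval_ms():.1f} ms")  # ~20.9 ms, roughly a 48 Hz refresh
```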

          • Airmantharp
          • 6 years ago

          Higher refresh rate monitors do diminish some of the advantages, but the primary advantage remains intact: variable framerates are displayed far more smoothly.

      • NeelyCam
      • 6 years ago

      Sort of like AMD with Mantle…?

      The difference is, though, that NVidia never loses.

        • Essence
        • 6 years ago

        You mean like the GTX 680 vs HD 7970 GHz? Or the GTX 690 vs HD 7990? And what about the R9 295X vs ?

          • Airmantharp
          • 6 years ago

          Those aren’t wins (or losses). The FX5800 was a loss.

            • JustAnEngineer
            • 6 years ago

            I remember buying a Radeon 9700 Pro in August 2002 when it launched. When GeForce FX5800 finally arrived in stores [b<]seven months later[/b<], its DX9 performance was a joke. NVidia and their shills embarked on a full-on smear campaign against 3DMark because the benchmark showed that GeForce FX sucked at DX9. When DX9 games finally appeared, it turned out the 3DMark was right. However, NVidia's biggest failure by far was NV1.

            • Airmantharp
            • 6 years ago

            Yeah, Nvidia bet wrong on DX9- something about ATi going 24bit for their pipelines and them being the ‘reference design’ with developers following suit, while Nvidia had 16bit pipelines that could do 24bit or 32bit at half speed. They caught up with the next generation, of course, but the ‘FX’ series was a rough one.

            And I hear you on NV1.

        • Bensam123
        • 6 years ago

        Mantle isn’t vendor locked. Mantle also doesn’t require dedicated hardware.

          • MathMan
          • 6 years ago

          Can I have some of what you’re smoking?

          I mean: even the API is only available under restricted NDA, and you need an AMD card. How is that not dedicated hardware?

          Or do you actually believe the ‘Mantle is open’ story?

            • Bensam123
            • 6 years ago

            I think you’re confusing beta stages with something that’s going to be vendor locked bro. They already announced the SDK will be coming out this summer.

            How often do developers or hardware manufacturers give people hardware or software they’re working on when it’s not reached that stage? It’s not out yet, therefore it’s vendor locked?!?! oO I’d honestly like what you’re smoking.

            • MathMan
            • 6 years ago

            The only thing that I can find officially from AMD about Mantle being open, is from their FAQ page (http://support.amd.com/en-us/kb-articles/Pages/mantle-faq.aspx):
            “Our intention is for Mantle, or something that looks very much like it, to eventually become an industry standard applicable to multiple graphics architectures and platforms.​”

            Did you note the ‘or something that looks very much like it’? Why in the world would they add such a cop-out? Do you know something that looks very much like it? I do… It’s called DX12.

            Just like GSYNC, Mantle has served its purpose: it has prodded the industry in a direction it otherwise wouldn’t have gone. Eventually, it will die and the multi-vendor alternatives will take over. It always happens that way.

            • Bensam123
            • 6 years ago

            I think you’re confusing open source with proprietary with vendor locked. That seems to be a common mistake here. Something can be proprietary, but not vendor locked… like directx. Something can also be proprietary as well as vendor locked, like gsync.

            Mantle is proprietary, but not vendor locked. Meaning Nvidia can implement mantle in the future if they want after AMD releases the SDK. However AMD cannot implement gsync, because Nvidia locks AMD out of their hardware (and of course has patents). The same applies to PhysX.

            • Voldenuit
            • 6 years ago

            Good call. It would be as absurd as claiming PhysX is “open” since it

            a. Doesn’t charge developers license fees
            and
            b. Runs on x86*

            Both are true, but it’s still proprietary as all get out. Good thing I haven’t seen anyone try to claim PhysX is “open”.

            * EDIT: Come to think of it, does PhysX even still have an x86 codepath? I haven’t heard much about that since before AGEIA got bought up by nvidia.

            • Bensam123
            • 6 years ago

            Try accelerating PhysX on AMD hardware. It’s vendor locked bro.

            • Voldenuit
            • 6 years ago

            I think you missed the bit where I said that claiming PhysX is open is as absurd as claiming Mantle is “open”.

            If and when AMD puts their money where their mouth is and releases the SDK, then we can reclassify it. But until then, promises don’t feed horses (is that even a saying?).

            • Bensam123
            • 6 years ago

            Why do people keep assuming AMD won’t release the SDK when they said they’re going to release it this summer? Does AMD have a history of saying something is going to be open to other vendors and then locking it down? Does AMD actively go out of their way to lock out competitors? Do we have ANY REASON TO ASSUME AMD WON’T RELEASE THEIR SDK? No.

            This has nothing to do with putting money where their mouth is, this is a release schedule for software.

            Guilty until proven innocent seems to be a thing around here only when talking about AMD.

            • Airmantharp
            • 6 years ago

            AMD has earned the suspicions that surround them- most of us have been here long enough to witness them over-promising and under-delivering, or just flat-out ignoring issues.

            So yeah, until they actually release the SDK, Mantle is vendor locked.

            • nanoflower
            • 6 years ago

            Even if they do release it so that other vendors can implement Mantle, it may not matter. With implementing Mantle on Intel or Nvidia hardware taking quite some time and DirectX 12 coming next year, it’s possible that Intel and Nvidia won’t bother with Mantle. So while there may not be vendor lock-in, Mantle may remain available from only one vendor.

          • Klimax
          • 6 years ago

          So when will pre-GCN cards get it? Mantle is already locked to GCN-only hardware, so even if it were open, nobody would use it.

          And why would they even consider it when proper DX12 is arriving? (Which predates Mantle anyway.)

            • Bensam123
            • 6 years ago

            Is it locked as in AMD is actively making sure other manufacturers don’t implement it, like G-Sync, or ‘vendor locked’ as in they haven’t received the SDK yet, so there is no possible fucking way they could develop hardware for software they don’t have?

            This conversation has nothing to do with DX12; we’re talking about G-Sync being vendor locked.

        • Wild Thing
        • 6 years ago

        Sure…Titan z has to count as a big win right?

    • Amazing Mr. X
    • 6 years ago

    This is exactly what we needed.

    Ever since I got my G-Sync upgrade kit from The Tech Report, I’ve been amazed at how wonderful and horrible the technology has been. Sure, it works, and what it does, it does exceptionally well. However, the kit features technologies that are rendered incompatible with one another in software, and Nvidia has yet to release a driver with any release notes detailing upgrades or bug fixes to G-Sync technology. To put it bluntly, Nvidia has been resting on their laurels when it comes to G-Sync, and that probably has more to do with a lack of competition in this arena than anything else.

    At least, until now.

    Hopefully, this will be the kick in the pants Nvidia needs to start smoothing out the G-Sync user experience. Getting its basket of monitor technologies working together in a user-friendly way should become a top priority for Nvidia if they really want to compete and move units going forward. There is a very compelling product in the G-Sync technology suite, I’m sure of it, but if Nvidia doesn’t start taking its support seriously, they’re going to end up completely buried under this open standard AMD has helped VESA to implement.

    [i<]TL;DR: Open technologies and standards benefit everyone, hopefully even G-Sync users.[/i<]

      • Airmantharp
      • 6 years ago

      Could you be more specific as to the issues you are experiencing when using G-Sync? Thanks!

        • Amazing Mr. X
        • 6 years ago

        I’ve literally tried approaching a response to this four separate times, and have yet to formulate something that isn’t an incomprehensible wall of text. So, I apologize if this final attempt at an answer comes across as nothing but unintelligible blabber. The issues here are just extremely complex, very obvious, and far too broad for me to [i<]easily[/i<] encapsulate in the space of a single comment. I apologize in advance.

        That said, G-Sync as a technology works fine. Content in the range of 30 to 144 fps running as a full-screen application in Windows will always run at a refresh rate that matches the content on a 144Hz monitor. That’s been true of every single program I’ve thrown at it thus far. As well, G-Sync automatically uses Nvidia’s wonderful Adaptive V-Sync to remove tearing artifacts above the maximum refresh rate of your monitor. So far, this works perfectly as well, and manages to remove any traces of screen tearing and stuttering alike. These two services comprise a technology that Nvidia calls G-Sync, and they function well.

        However, the G-Sync technology is not all that Nvidia has packaged into the G-Sync kit. In fact G-Sync, as I’ve just described it, is one of three entirely Nvidia-exclusive technologies that appear to be shipping in every single G-Sync kit. These technologies are: G-Sync, ULMB, and 3D Vision.

        Now here’s where things get complicated and difficult to explain. Switching between G-Sync, ULMB, and 3D Vision is impossible to do outside of the Nvidia Control Panel. So if you accidentally boot up a game in 3D Vision, and you’d rather play it with G-Sync, you have to close the game and go back to the Nvidia Control Panel. If you want to take a shooter particularly seriously and eliminate motion blur in ULMB mode, after playing some single-player action with G-Sync, you’ve got to close the game again and go back to the Nvidia Control Panel. You have to do this all of the time, for each game, depending on how you prefer to play that game. Why? Because none of the three big technologies in a G-Sync kit work together. They just don’t.

        It’s an impossibly glaring oversight of the G-Sync kit’s design and implementation, but the strobing backlight of 3D Vision and ULMB only strobes at one of three fixed refresh rates. Both only appear to work at 120, 100, and 85 Hz. Beyond or below these values you get nothing, and since G-Sync doesn’t have a fixed refresh rate, that means you’re essentially stuck not using these technologies at the same time.

        Now this wouldn’t be too much of a problem if it were a simple or straightforward process to switch between G-Sync, ULMB, and 3D Vision, but of course it isn’t. In addition to requiring entirely different configurations in the Nvidia Control Panel, the optimal configurations for something like ULMB can involve frame rate limiters, and 3D Vision may require additional plugins or 3rd-party patches that don’t work well, or at all, in 2D viewing modes. This makes actually playing your favorite games a juggling act of pregame settings configuration. Heaven forbid you want to play something casually, or just because you feel like it, because the very next thing on your mind is going to be all of the settings you’re going to have to fiddle with just to get the game working the way you want to play it.

        Of course, all of this wouldn’t be a problem if there were keyboard shortcuts for turning this stuff on and off, but there aren’t. Even just instantly turning G-Sync on and off when you press the 3D Vision and ULMB buttons would be okay, but Nvidia doesn’t do that either. Ideally the G-Sync module would strobe the backlight in time with the frame rate, making G-Sync inherently compatible with 3D Vision and ULMB, but Nvidia definitely isn’t doing that right now, and I have no idea why.

        You see, the G-Sync kit’s PCB is literally twice as big as my GTX 680. It was so massive that Nvidia forces the removal of all of the internalized power components of a VG248QE, just to barely squeeze this massive thing into the back of my monitor with an all-new and previously unnecessary external power brick. I literally cannot overstate the size of this thing: it is bigger than two ITX motherboards, and that’s before you start considering the fact that there’s another daughter card plugged into it. That daughter card is the thing you see in all of Nvidia’s marketing screenshots, and it’s just a small card sitting on an extremely huge and reasonably busy PCB.

        So no, with that much obvious engineering going on with this thing, I refuse to believe that this isn’t some sort of driver-correctable problem. There’s no reason for Nvidia to shove more PCB into a monitor than most people have in their entire computer if they can’t even get the backlight to strobe when they tell it to. I’m sure Nvidia is just sitting on updates to this, or firmware for this thing, because it’s very clearly far from perfect. That, and it doesn’t always detect my one video input correctly from a cold boot, falsely popping up a box declaring no video signal over my Windows login screen at times. It’s just a glitch with the OSD, sure, but this is what I’m talking about. This kit, without serious software upgrades, isn’t ready for prime time. I have a fairly high tolerance for convoluted and impossible technology, and all of this is even starting to grate on me after several months. Once more, the complete lack, thus far, of driver updates that even casually reference G-Sync in the release notes isn’t a good sign either.

        My hope is that this new competing open standard forces Nvidia to rethink their position on G-Sync support. The G-Sync kit is an interesting product with a lot of promise; however, right now it’s considerably more promise than anything else. The G-Sync technology itself is fine, but it needs more compatibility with the rest of Nvidia’s technologies, and it needs them before Nvidia starts flooding products onto the market. I like the idea of what Nvidia is offering, and I like the concept of officially branded, Nvidia-certified monitors that offer the ultimate gamer package of +120Hz, G-Sync, 3D Vision, and ULMB. However, the current G-Sync kit is still quite far from an enjoyable, consumer-friendly product, and is in clear need of some much-needed love.

          • Airmantharp
          • 6 years ago

          Thanks!

          Here’s what I’m seeing:

          1. You’re actually trying to use ULMB and 3D Vision, which aren’t actually part of G-Sync at all, but are part of the ASUS monitor that G-Sync debuted on. You could use both before installing the G-Sync kit, though I hear that ULMB was improved somewhat. And the issue is that these solutions require switching around settings in the driver software depending on the game, somewhat like having to switch Crossfire/SLI on and off. I encountered this with Crossfire and switching between BF3 and Skyrim when they were released, and I agree that it was a pain in the ass.

          2. You’re running what is essentially a well-developed beta- I mean, as you say, it does work- but it is quite over-engineered. That’s something we all agree on; the functionality enabled by G-Sync should be able to evolve into something far simpler than what Nvidia produced for the first run, and certainly far cheaper.

          Here’s what I think: Having to switch driver settings for three different technologies with very different goals is somewhat of a given. The fact that 3D Vision isn’t very well supported and requires game hacks to work sucks, but it’s the same problem that PhysX faces in that market penetration just doesn’t warrant the cost of developer support. This stuff is literally cutting edge technology for consumers, even if the basic principles aren’t that complicated.

          So be happy! You get to experience gaming in a way that almost no one else out there can. So what if it’s a pain in the ass?

          🙂

    • wingless
    • 6 years ago

    My G-Sync monitor is F’ing EXTRAORDINARY! I hope this spec brings real G-sync level performance to all displays. It makes a big difference at all framerates, but is very noticeable at lower framerates.

    • Parallax
    • 6 years ago

    Now for the important part: How will software (OS/drivers) handle multiple requests for different frame rates?

    If I want to show a 24Hz video windowed on a 60Hz desktop, does the operating system bump the refresh rate to the least common multiple of 120Hz to sync them? What happens if 120Hz is not available?

    If I’ve got 2 displays with different available refresh rates, which is chosen as the default? Would a secondary display be forced to whole fractions (e.g. 1/2, 1/3) of the primary in order to sync them?

    If I want to always run at the maximum supported refresh rate limited by cable bandwidth, can I do so (like some G-SYNC implementations)?

    These are basic questions that I have not seen anyone address yet.
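On the first of those questions, the arithmetic at least is straightforward (a hypothetical sketch; how an OS or driver would actually arbitrate this is exactly the open question): find the least common multiple of the content rates and check it against what the panel supports.

```python
from math import lcm  # requires Python 3.9+

# Illustrative only: look for a common refresh rate that serves mixed content
# (e.g. 24 Hz video windowed on a 60 Hz desktop) on a single display.
def common_refresh(rates_hz, panel_max_hz):
    candidate = lcm(*rates_hz)
    return candidate if candidate <= panel_max_hz else None  # None: no exact fit

print(common_refresh([24, 60], 144))  # 120 -> both sources divide it evenly
print(common_refresh([24, 60], 75))   # None -> something has to judder or resample
```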

    • chuckula
    • 6 years ago

    Hrmm… pros & cons:

    Pros:
    Simpler than G-sync
    More cross-platform ready than G-sync.

    Cons:
    G-sync can deliver superior results at the price of being more complex and Nvidia-only.
    It looks like you still need an upgraded monitor, albeit the upgrades are less expensive than G-sync controllers… but then again, maybe monitors will end up getting both if they need hardware upgrades at all.

      • superjawes
      • 6 years ago

      I’m really curious to see whether these are, in fact, the “same thing.” If they are, I’d also like to hear why Nvidia decided that they needed a custom solution. However, it will be more interesting to see if Adaptive-Sync runs into problems that Nvidia already solved, or if GSync solutions provide a better end result.

      Damage Labs will be quite busy, lol.

        • Airmantharp
        • 6 years ago

        Nvidia needed a custom solution because no one else would move an inch without customer demand, which wouldn’t exist until someone like Nvidia stoked it with a custom solution 🙂

        • Voldenuit
        • 6 years ago

        [quote<] I'd also like to hear why Nvidia decided that they needed a custom solution.[/quote<] I imagine it's because nvidia could not modify the signalling standards in DVI and DP, whereas AMD was merely leveraging existing functionality in LVDS (which was incorporated to solve a different issue, i.e. battery life). Bottom line is that G-sync and Freesync probably both played a part in VESA's decision to pay attention to an issue that had not previously been a priority. Early (G-sync) adopters get burned, but then, that's always the case with early adopters.

          • Airmantharp
          • 6 years ago

          Early G-Sync adopters get G-Sync- it’s hard to complain about that!

          • cygnus1
          • 6 years ago

          Just to be clear it wasn’t LVDS that was leveraged, it was eDP. VESA already has this tech included in eDP and have probably mostly copied and pasted it into DP 1.2a.

      • Airmantharp
      • 6 years ago

      Monitors could definitely support both- there’s nothing stopping a G-Sync-equipped monitor from supporting Adaptive-Sync, since it would just bypass the G-Sync module altogether.

      Same with Adaptive-Sync monitors- in theory, you don’t need a scaler for them, but in practice, you will in order to remain compatible with systems that don’t support Adaptive-Sync as well as to support DVI/HDMI/Analog input. If you need the scaler, there’s no reason to include a standard one instead of a commercialized G-Sync-supporting ASIC as well.

      It looks like the performance advantage goes to the Adaptive-Sync setup, though, because it eliminates the monitor’s scaler and associated buffer in the render chain, reducing input lag by some degree.

      • Entroper
      • 6 years ago

      [quote<]G-sync can deliver superior results[/quote<] In what way?

        • chuckula
        • 6 years ago

        This solution has 1 frame of latency, which is on the order of 33 to 16 milliseconds over 30 – 60 FPS rates. G-sync has much tighter control between the generation of the frame and the sync to the screen that pushes latency down to about 2 milliseconds or so.

          • Entroper
          • 6 years ago

          Brilliant. An idea to nearly eliminate latency becomes an implementation that guarantees one full frame of latency.

            • Voldenuit
            • 6 years ago

            [quote<]Brilliant. An idea to nearly eliminate latency becomes an implementation that guarantees one full frame of latency.[/quote<] If true, that is pretty lame. Oh well, it should at least be good for smoothness, right (for both games and movies)?

          • Entroper
          • 6 years ago

          I see that chuckula’s comment is at -5. Does anyone actually have information supporting or refuting his answer to my question?

            • moose17145
            • 6 years ago

            This. I would very much like to know more techy details about how these two are implemented and how they are different from one another. I was under the impression that Adaptive / FreeSync was as good or better than G-Sync in terms of performance. The one frame buffer would prove to be a real problem though, especially if you are dipping into rather low frame rates during the game.

      • Wild Thing
      • 6 years ago

      First Titan Z… now G-Sync. Must be a horror month for NV fans…

        • Sabresiberian
        • 6 years ago

        Titan Z is a non-issue for all that can’t afford a single video card costing $1500, never mind $3000. Both those prices put them in the realm of “buyers with a lot of spare cash”, so it’s just a matter of how relatively wealthy you are. Neither solution makes sense when you can get performance similar to the single-card solution anyway by using 2 separate cards for a significantly lower cost.

        Why would Nvidia fans be opposed to a better standard for everyone? Almost all of us don’t own G-Sync monitors now, and most of us won’t be buying a new monitor in the next year. It is as much a win for us as it is for AMD “fans”.

        You clearly missed the part where Nvidia said their primary intent was to push the industry into a better graphics standard for everyone. That happened. This mirrors AMD’s intent for Mantle. Both companies deserve credit for making things better for us all, and only a really negatively spirited individual would cry about how it was done.

        And, personally, I like to get paid for my work, and certainly don’t blame Nvidia or AMD for trying to net a short-term positive cash flow (knowing it would be short term because the work would not, could not, remain proprietary) to help cover their development expenses. I mean, what would you do if your employer asked you to create something and told you that you wouldn’t be paid for it?

    • davidbowser
    • 6 years ago

    I wish I could take credit and claim they [url=https://techreport.com/discussion/25788/a-first-look-at-nvidia-g-sync-display-tech?post=789955<]took my advice[/url<], but I like it anyway. As long as it is unencumbered, I will usually side with the open standard. The takeaway should be that AMD and Nvidia were both moving in the right direction on this and that their combined momentum carried weight with VESA. Hopefully, Nvidia will pivot quickly and move in the direction of the standard. With this announced, I would be REALLY surprised to see OEMs spending any significant money on the proprietary implementation.

    • Chrispy_
    • 6 years ago

    I can’t be bothered to read into this since displays with this tech will be 6-12 months away, but does anyone know if this can be back-ported to existing Displayport displays?

      • Airmantharp
      • 6 years ago

      Nope- in order to do this, you either

      a) need to have the display set up like laptop displays, where there is no scaler or other hardware and the panel is essentially driven by the GPU from the frame buffer,

      or b) you need hardware in the monitor that’s equivalent to the G-Sync module, which allows the monitor to be driven using mostly normal DisplayPort signaling.

      The whole point is to remove the effect that the built-in scalers and associated buffers in current monitors impose on the timing of display refreshes, either by getting rid of it altogether (Adaptive-Sync/Free-Sync) or by replacing it with something that’s more flexible, a la G-Sync.

    • USAFTW
    • 6 years ago

    There goes GSync… Vesa v-sync standard
    There goes Mantle… DirectX 12 coming (eventually)
    Why do they do that? They announce their proprietary standards knowing that within months their effort will be ignored? At least mantle got implemented here and there with mixed results, but it seems to me GSync will die before arrival.
    Does GSync work with DVI outputs? It requires DisplayPort, but when VESA implements their own open standard, why bother?

      • DPete27
      • 6 years ago

      Because if they didn’t come to market with their proprietary stuff, the universal standards would never change.
      I do agree that it would be economically more efficient to just go to VESA / DX and get the standards updated instead of “wasting” so much time and money on working prototypes. But you have to have a proof of concept to do that. Since the universal standards take time to change, AMD / Nvidia likely saw an opportunity for profit by releasing their proprietary stuff (proof of concept) to retail in the interim.

        • USAFTW
        • 6 years ago

        I do think that NVidia’s GSync and AMD’s Mantle worked as bump-starts for these recent developments, which, despite their proprietary nature, benefit us as end-users.

    • invinciblegod
    • 6 years ago

    Basically it is the difference between now and later. Gsync and Mantle are available now at a premium while freesync and directx12 will be available later and presumably cheaper. Gsync already has the monitor circuitry built (but delayed for some reason) while freesync is probably at least a year away. Same for Directx12.

      • Sargent Duck
      • 6 years ago

      [quote<]Gsync and Mantle are available now at a premium[/quote<] Mantle is free...

        • Ninjitsu
        • 6 years ago

        Mantle isn’t free. You need an AMD card. It also requires more work for the software developer.

          • JustAnEngineer
          • 6 years ago

          Millions of folks already have AMD GPUs or APUs. No additional investment is required.

            • Ninjitsu
            • 6 years ago

            Yeah, but disproportionately many millions more [i<]don't[/i<].

      • superjawes
      • 6 years ago

      Remember that the GSync circuits are still in FPGA form, which means that the design isn’t ready to be put into permanent silicon. If they’re delayed, it’s probably because they had/have some unexpected rework, as opposed to just tweaking and optimizing.

      I don’t know enough about GSync and Adaptive Sync to say whether or not they are “the same thing,” but if Nvidia is running into issues with their tech, I wouldn’t be surprised if Adaptive Sync implementations run into similar issues at some point, and that could either delay its rollout or delay implementation in certain monitors. There’s a bit about that [url=https://techreport.com/review/25788/a-first-look-at-nvidia-g-sync-display-tech/2<]here.[/url<] [quote<]Doing refreshes at varying intervals creates all sorts of havoc for LCDs, including color/gamma shifting and the like. I had no idea such problems were part of the picture, but happily, Nvidia seems to have worked them out. You'd never know about any potential for trouble when seeing the early G-Sync solutions in action. [/quote<]

        • cygnus1
        • 6 years ago

        Except this isn’t a new standard. It’s what was already built for eDP, just copied over to DP 1.2a. It’s all fairly well known and just needs to be moved into desktop monitors.

          • superjawes
          • 6 years ago

          No one was talking about this until Nvidia unveiled GSync…

          Also, just because people know about it does not mean that it has been implemented. There still could be growing pains with actually getting it to work, as well as growing pains getting it to scale. Remember that this was presented as a power saving opportunity (scaling down the refresh when not needed), and not a performance based one. Having it at 60 Hz would be nice, but getting it at 120 Hz or more would be better.

            • cygnus1
            • 6 years ago

            [quote<]No one was talking about this until Nvidia unveiled GSync...[/quote<] Definitely not arguing that. I'm actually quite thankful nVidia has pushed this tech into the gaming conversation. I fully intend for my next monitor purchase to have A or G Sync and that is thanks to them, no doubt. But I do feel like there's already been a lot of work done on the VESA spec version of this tech for laptop panels. I think a lot of it will be easily transferable to desktop panels. Well, I hope all that is true anyway, since I want this tech sooner rather than later.

      • joyzbuzz
      • 6 years ago

      Obvious FUD when Mantle is FREE to the end user.

    • wizpig64
    • 6 years ago

    hope nvidia doesn’t stubbornly refuse to support this.

      • puppetworx
      • 6 years ago

      It would be a bad move for Nvidia and their customers. I imagine they will implement their own software/hardware solution to make use of DisplayPort(TRADEMARKED!!! JUST SO YOU KNOW!!!) Adaptive-Sync and may even keep the G-Sync name for it. Of course I doubt they’ll fess up to their strategy anytime soon; they still have stock to move.

      • joyzbuzz
      • 6 years ago

      This isn’t Mantle. A year from now nearly all monitors with a DisplayPort will support the new standard, hence they will support Adaptive Sync by default. Not supporting Adaptive Sync would drive customers to AMD en masse. There’s no chance Nvidia doesn’t have an Adaptive Sync solution in the works.

    • anotherengineer
    • 6 years ago

    “Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.”

    from
    [url<]http://www.vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/[/url<]

    So my question - when will this be mainstream in (desktop) monitors?

    Edit - (desktop)

      • Puiucs
      • 6 years ago

      in about 6 to 12 months. depends on how fast they manage to make new monitors. with every new standard it takes an extreme amount of time, paperwork and testing to release a final product.

    • Deanjo
    • 6 years ago

    [quote<]For video playback, display refresh rates could be lowered to sync with low-frame-rate video sources like 24 FPS movies, eliminating the need for inverse telecine conversion. [/quote<]

    Already can be done by setting the appropriate video mode in full screen. Most monitors already have native 23.976/24 display modes and don’t need to use IVTC if the person has configured their software properly (and it supports switching display modes). The real holy grail is to get that proper refresh rate in windowed playback, which I don’t see FreeSync providing either.

    [quote<]The fact that a spec update happened is also a bit of a blow to Nvidia, simply because the firm’s G-Sync guru, Tom Petersen, told us at CES that he didn’t think an update to DisplayPort was needed for variable refresh. Evidently, VESA was persuaded otherwise.[/quote<]

    And Tom was right; adding it to the specification for DP 1.2a is purely a “put on paper” move.

      • xeridea
      • 6 years ago

      FreeSync would work for windowed video… it would be more of an issue with software/OS being aware of it.

        • orik
        • 6 years ago

        why would you ever want to use variable vblank intervals for a windowed application?

          • Duct Tape Dude
          • 6 years ago

          Same reasons you’d have it for a fullscreen application: fluidity and power saving. It’d just be up to the OS or driver to schedule the next frame.

          • Deanjo
          • 6 years ago

          Watching an embedded stream on a webpage for one….

    • Firestarter
    • 6 years ago

    I can’t wait for this to hit the shelves, it would be a great excuse to upgrade!

      • Corion
      • 6 years ago

      Excuse? Mandate!

        • JustAnEngineer
        • 6 years ago

        3840×2160 IPS or *VA displays that appear as a single tile and support DisplayPort 1.2a Adaptive Sync should be coming out just about the same time that we see higher-performing next-generation 20nm GPUs from AMD and NVidia. That sounds like an [b<]excellent[/b<] time to upgrade.

          • Firestarter
          • 6 years ago

          actually I’d even do TN, if the other options are unavailable or way out of my price range

    • PixelArmy
    • 6 years ago

    G-Sync : Adaptive Sync :: Mantle : DX12

      • xeridea
      • 6 years ago

        Mantle has dozens of products that use it and doesn’t cost hundreds of dollars extra; Gsync has 0 uses.

        • Deanjo
        • 6 years ago

        G-sync can be used on any software, the same can’t be said about Mantle.

          • UnfriendlyFire
          • 6 years ago

          You need a compatible monitor though.

            • Deanjo
            • 6 years ago

            And you need an AMD card to use Mantle. You even need a supported monitor for FreeSync.

            • UnfriendlyFire
            • 6 years ago

            I don’t know that many monitor manufacturers that won’t support DisplayPort in the future.

            To get a comparison, USB3 was widely adopted a few years after its introduction.

            Intel launched Thunderbolt. I think you get the idea of what the USB3 vs Thunderbolt adoption rate is.

            • Ninjitsu
            • 6 years ago

            Well, I don’t know that many (related) software developers and GPU manufacturers that won’t support DX12 in the future.

            • shank15217
            • 6 years ago

            And an Nvidia card, which deanjo fails to say even though he just said freesync requires AMD cards.

            • Deanjo
            • 6 years ago

            I didn’t say you need an AMD card to support FreeSync. Read again.

            • Ninjitsu
            • 6 years ago

            He said [b<]Mantle[/b<] requires AMD cards (which it does). You need a compatible monitor [i<]and[/i<] graphics card for Adaptive-Sync, except there's no GPU vendor lock-in as with G-Sync. It's exactly the same logic with DX12: the features of Mantle without the GPU vendor lock-in. I don't know why AMD locking us in is considered better than Nvidia locking us in.

            • Tech Savy
            • 6 years ago

            At this point, regardless of who pushed the first domino over, I think the industry is headed for a proprietary war. The two sides will continue to present exclusive products for a while, each pointing the finger at the other as the proprietary bad guy; one will probably emerge the victor, but the market will be split.

            More proprietary hardware/software will let us experience more specialized benefits designed into the hardware, in exchange for brand loyalty. However, I don't see any landslide victory, so eventually their perspectives will change when they see they are putting more work into producing better hardware for a smaller audience.

            Then standardization will return. If Windows is still holding the API-standard torch at the end of this, then they will have to be more diligent about folding custom-built, cutting-edge technology advancements into their API updates. Otherwise, eventually someone new will have to hold the torch, or this will happen again.

        • maxxcool
        • 6 years ago

        Mantle is only shipping for BF4 and Thief in non-beta… that is it.

      • cynan
      • 6 years ago

      G-Sync : Adaptive Sync :: Mantle : DX12 ::: Steamroller : PentiumD :: Early gen SandForce SSDs ::::: IBM Death star PentiumD:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::: Israeli/Arab conflict :::: A world plagued by corruption/greed/lust for power ::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
      ::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
      getting a speeding ticket when following a line of cars that were also speeding.

        • lilbuddhaman
        • 6 years ago

        I believe a spam bot could have created a more readable and more thread-pertinent comment than this one right here. What the :::::: is :::::::?

          • Corion
          • 6 years ago

          Duh!
          :::::: : ::::::: :: :::::::: : :::::::::

    • jdaven
    • 6 years ago

    TechPowerUp is calling G-Sync dead.

    [url<]http://www.techpowerup.com/200741/vesa-adds-adaptive-sync-to-popular-displayport-video-standard.html[/url<]

    Did anyone ever buy a G-Sync-enabled monitor or upgrade kit?

      • Airmantharp
      • 6 years ago

      I’m thinking that’s a little premature.

        • DancinJack
        • 6 years ago

        I dunno. Does G-sync work better than this implementation?

        Personally, if two technologies work just as well (for me), I’m probably going to buy the cheaper one.

          • orik
          • 6 years ago

          They work very differently.

          “In AMD’s implementation, VBLANK length (interval between two refresh cycles where the GPU isn’t putting out “new” frames, a sort of placebo frames) is variable, and the driver has to speculate what VBLANK length to set for the next frame; whereas, in NVIDIA’s implementation, the display holds onto a VBLANK until the next frame is received. In NVIDIA’s implementation, the GPU sends out whatever frame-rate the hardware can manage, while the monitor handles the “sync” part. In AMD’s the speculation involved in setting the right VBLANK length for the next frame could cause some software overhead for the host system. That overhead is transferred to the display in NVIDIA’s implementation. We’re looking forward to AMD’s whitepaper on FreeSync. AMD holds the advantage when it comes to keeping costs down when implementing the technology. Display makers have to simply implement something that VESA is already deliberating over. The Toshiba laptops AMD used in its FreeSync demo at CES already do.” (1)

          1: [url<]http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html[/url<]
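          One toy way to read that difference (my own sketch, not from TechPowerUp or either vendor's documentation): model "hold" as a display that stays in VBLANK until a frame arrives, and "speculate" as a driver that schedules the next refresh from a guessed VBLANK length (here simply the previous frame time), letting a late frame slip to the following slot. Both the prediction rule and the slip behavior are assumptions of the toy model, not documented behavior.

          [code<]
          /* Toy model of the two refresh policies described in the quote above. */
          #include <stdio.h>

          int main(void)
          {
              const double frame_ms[] = { 16.0, 22.0, 19.0, 30.0, 17.0 }; /* made-up frame times */
              double t = 0.0;           /* time at which each frame becomes available */
              double predicted = 16.0;  /* driver's guess for the next frame interval */
              double sched = 0.0;       /* next refresh scheduled by the speculative policy */

              for (int i = 0; i < 5; i++) {
                  t += frame_ms[i];

                  /* Hold-until-frame: the panel refreshes the moment the frame arrives. */
                  double hold = t;

                  /* Speculative VBLANK: refresh at the scheduled time; if the frame
                   * missed its slot, it slips to the next predicted slot. */
                  sched += predicted;
                  while (sched < t)
                      sched += predicted;
                  predicted = frame_ms[i];  /* assumed prediction rule: last frame time */

                  printf("frame %d ready %5.1f ms | hold shows %5.1f | speculative shows %5.1f (+%4.1f)\n",
                         i, t, hold, sched, sched - t);
              }
              return 0;
          }
          [/code<]
          The point is only the quote's claim about where the bookkeeping lives: the speculative policy needs the host to keep guessing, while hold-until-frame pushes that work onto the display.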

          • Voldenuit
          • 6 years ago

          Agreed. Also, it’s worth noting that G-Sync is proprietary on both GPU and display, whereas Adaptive Sync is a standard.

          Given the choice, I don’t think I’d pick the more expensive one that locks me into a GPU and monitor combo that may not even exist on the market in 2 years.

          And that’s speaking as a Geforce user.

          • Airmantharp
          • 6 years ago

          They both work equally well, as described. Actually, Adaptive-Sync might be superior because it eliminates the need for a second buffer in the monitor, as implemented in standard scalers and in G-Sync. This could have an impact on perceived input lag, though likely a very small one.

          However, it likely won't be any cheaper than a fully commercialized G-Sync monitor, given that such a monitor would most likely have to include Adaptive-Sync as an operational mode while still being equipped with a full-fat scaler and associated buffer, in order to remain compatible with systems that don't support Adaptive-Sync and with DVI/HDMI/analog inputs as well.

          So Adaptive-Sync looks like the ‘better’ option, but it won’t be cheaper, except in the case of laptops where it’s already deployable.

        • Duct Tape Dude
        • 6 years ago

        Can you really blame them though? If your expensive proprietary technology is going to be competing with a free widespread standard, the only thing you have left is market lead time.

        Maybe G-Sync will adapt into an Adaptive-Sync-compatible solution, but if everything remains as-is, it does look bleak for Nvidia, unfortunately.

          • cynan
          • 6 years ago

          If the timing controllers in G-Sync displays are only minimally more complex, and would therefore normally be only minimally more expensive to produce at scale than current versions, then the substantial increase in the cost of a G-Sync display is due to:

          1) Nvidia royalties
          2) Manufacturers predicting that practically no one will be interested in adaptive refresh rates, and therefore purchasing and implementing these somewhat more sophisticated controllers only in low volumes.

          Eliminating the first will help resolve the second.

            • superjawes
            • 6 years ago

            The current premium on G-Sync displays comes from the fact that the timing controller is still an FPGA. FPGAs are very expensive because they are not custom chips; instead, they can be reprogrammed to function differently if needed.

            • Airmantharp
            • 6 years ago

            This is the only reason G-Sync is ‘expensive’. In reality, a mass-produced ‘G-Sync’-style scaler ASIC would cost only very slightly more than the ones current monitors use, because the only major change is to allow refreshes on command over DP instead of running them according to a schedule set in the monitor’s firmware.

            • superjawes
            • 6 years ago

            Exactly. Now I’m not going to say that G-Sync [i<]won't[/i<] carry a premium (because you know that some executive is going to at least propose that idea), but the BOM price difference will be nothing once the ASICs are implemented.

            • Duct Tape Dude
            • 6 years ago

            I totally agree! Once the market matures, G-Sync and A-Sync will be roughly equally expensive to implement (barring, as you state, a branding markup).

            My point is that given a proprietary vs. standardized solution with identical capabilities, there are only two kinds of winners: the most widely available and the cheapest. Right now, G-Sync is more widely available and can enjoy a head start, but after the new DP proliferates, I think A-Sync may have a clear advantage (and thus I don’t really blame TechPowerUp for tolling the bell, though it is a little early).

      • Prestige Worldwide
      • 6 years ago

      They’re calling it dead for clickbait. Sensationalist article titles FTW!!!!!!!!!!!!!!!

        • jdaven
        • 6 years ago

        The funny thing is if you go to their mobile site, the ‘G-sync is dead’ part of the headline is not there.
