AMD could counter Nvidia’s G-Sync with simpler, free sync tech

CES — During an impromptu meeting in a hotel ballroom this morning, I got an eye-opening demo orchestrated by a senior AMD engineering executive. He had a pair of relatively inexpensive laptops sitting side by side running a simple graphics demo showing a windmill with the blades in motion. One of the laptops was using traditional vsync, only refreshing the display at a fixed rate, and the quantization effect of the fixed refresh cycle introduced obvious roughness into the animation. On the other laptop, however, the motion was much smoother, with no apparent tearing or slowdowns—much like you’d see from Nvidia’s G-Sync technology.

He explained that this particular laptop’s display happened to support a feature that AMD has had in its graphics chips "for three generations": dynamic refresh rates. AMD built this capability into its GPUs primarily for power-saving reasons, since unnecessary vertical refresh cycles burn power to little benefit. There’s even a proposed VESA specification for dynamic refresh, and the feature has been adopted by some panel makers, though not on a consistent or widespread basis. AMD’s Catalyst drivers already support it where it’s available, which is why an impromptu demo was possible.

Dynamic refresh works much like G-Sync, varying the length of the vertical blank period between display refreshes on a per-frame basis, so the screen can be drawn when the GPU has a finished frame ready to be displayed.
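
To make that concrete, here is a minimal sketch of the difference (Python, with invented panel numbers; an editorial illustration, not AMD's or Nvidia's actual driver logic): under a fixed refresh a finished frame waits for the next scheduled tick, while under a variable vblank scanout can begin as soon as the frame is done, subject only to the panel's fastest refresh.

    import math

    REFRESH_MS = 16.7        # fixed refresh interval at ~60 Hz (hypothetical)
    MIN_INTERVAL_MS = 6.9    # fastest this hypothetical panel can refresh (~144 Hz)

    def wait_fixed(render_ms):
        # The finished frame sits in a buffer until the next scheduled refresh tick.
        next_tick = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
        return next_tick - render_ms

    def wait_variable(render_ms):
        # The vblank is stretched until the frame is ready, so the only wait left
        # is whatever the panel's minimum refresh interval still imposes.
        return max(0.0, MIN_INTERVAL_MS - render_ms)

    for render_ms in (5.0, 20.0, 28.0):
        print(f"{render_ms:4.1f} ms frame: fixed wait {wait_fixed(render_ms):4.1f} ms, "
              f"variable wait {wait_variable(render_ms):3.1f} ms")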

The lack of adoption is evidently due to a lack of momentum or demand for the feature, which was originally pitched as a power-saving measure. Adding support in a monitor should be essentially "free" and perhaps possible via a firmware update. The only challenge is that each display must know how long its panel can sustain the proper color intensity before it begins to fade. The vblank interval can’t be extended beyond this limit without affecting color fidelity.
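
As a hypothetical illustration of that constraint (the 33 ms hold time is an invented figure; real panels differ), the refresh interval could be stretched to match the frame time only up to the panel's limit, beyond which the previous frame would have to be repeated:

    # Hypothetical panel that can hold proper color intensity for at most ~33 ms,
    # i.e. the refresh interval can't be stretched below roughly 30 Hz.
    PANEL_MAX_HOLD_MS = 33.0

    def next_refresh_interval(frame_time_ms):
        if frame_time_ms <= PANEL_MAX_HOLD_MS:
            return frame_time_ms, "scan out the new frame as soon as it is ready"
        # The new frame is taking too long; refresh with the old one to avoid fading.
        return PANEL_MAX_HOLD_MS, "repeat the previous frame, show the new one later"

    print(next_refresh_interval(20.0))
    print(next_refresh_interval(50.0))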

In AMD’s assessment, it’s possible to reduce some of the problems with traditional vsync that Nvidia described in its G-Sync presentations—particularly the quantization effect felt most painfully at frame rates between 60 and 30 FPS—through the use of triple-buffering. Triple-buffering is a time-honored technique that can be implemented by a game developer in software or even enabled via a software switch in a graphics driver control panel. AMD used to have an option to force the use of triple buffering in its driver control panel, in fact, and would be willing to consider bringing it back.
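
For readers unfamiliar with the technique, here is a schematic sketch of the triple-buffering behavior being described (Python, an editorial illustration rather than anything from AMD's drivers): the GPU always has a free buffer to render into, and each fixed refresh flips to the newest completed frame, silently dropping any older frame that was never shown.

    from collections import deque

    class TripleBufferSketch:
        """Schematic triple buffering against a fixed refresh (illustration only)."""
        def __init__(self):
            self.front = None             # frame currently being scanned out
            self.ready = deque(maxlen=1)  # newest completed, not-yet-shown frame;
                                          # an older unshown frame is simply dropped

        def gpu_completed(self, frame):
            # The GPU never stalls: it hands over the finished frame and immediately
            # starts rendering the next one into the buffer that was freed up.
            self.ready.append(frame)

        def on_refresh_tick(self):
            # At each fixed vsync tick, flip to the most recently completed frame.
            if self.ready:
                self.front = self.ready.pop()
            return self.front

    buf = TripleBufferSketch()
    buf.gpu_completed("frame 1")
    buf.gpu_completed("frame 2")     # finished before the tick, so frame 1 is never shown
    print(buf.on_refresh_tick())     # -> frame 2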

The quantization problem can only be completely resolved via dynamic refresh rates. However, the exec initially expressed puzzlement over why Nvidia chose to implement them in expensive, external hardware.

The exec’s puzzlement over Nvidia’s use of external hardware was resolved when I spoke with him again later in the day. His new theory is that the display controller in Nvidia’s current GPUs simply can’t support variable refresh intervals, hence the need for an external G-Sync unit. That would explain things. I haven’t yet had time to confirm this detail with Nvidia or to quiz them about whether G-Sync essentially does triple-buffering in the module. Nvidia has so far been deliberately vague about certain specifics of how G-Sync works, so we’ll need to pry a little in order to better understand the situation.

Regardless, the good news here is that AMD believes a very effective G-Sync-like variable refresh technology shouldn’t add any cost at all to a display or system. The term "free sync" is already being spoken as shorthand for this technology at AMD.

That said, AMD is still in the early stages of cooking up a potential product or feature along these lines, and it has nothing official to announce just yet.

AMD believes the primary constraint in making this capability widespread is still monitor support. Although adding dynamic refresh to a monitor may cost next to nothing, monitor makers have shown they won’t bother unless they believe there’s some obvious demand for that feature. PC enthusiasts and gamers who want to see "free sync" happen should make dynamic refresh support a requirement for their next monitor purchase. If monitor makers get the message, then it seems likely AMD will do its part to make dynamic display synchronization a no-cost-added feature for Radeon owners everywhere.

Update: The original version of this story had some confused wording about AMD’s position on where triple-buffering could be useful. We’ve updated the text to correct the error.

Comments closed
    • C-A_99
    • 6 years ago

    Triple buffering, by definition, adds too much latency to be of any use, unless there’s something I’m missing here.

    • Silus
    • 6 years ago

    LOL this is how AMD sounds in almost EVERY tech article written about them:

    “Hey, you guys…Yes, you over there! Pay attention to us…I noticed you wrote an article about <insert my competitor here> and I have to tell you…we just figured out a much better way of doing just that!!! We never showed it before because we wanted to surprise everyone, but believe it, our tech is much, much better and completely free! So stay tuned”

    Months and months go by and it is either never really implemented or is implemented, but falls into obscurity because it’s not well supported (due to poor drivers and such)

    Hey AMD, some advice: Actually do your work, have the product completed and running 100% (that means having proper drivers…and actual review samples of the hardware that aren’t different to the retail ones) and then you can do your press conference, OK?

    • tom_in_mn
    • 6 years ago

    For a constant frequency example, such as a spinning windmill, it seems clear that if you fiddled with the refresh rate you could find one that would give you a smoother looking display — just as you can pick some rotation rates that look bad on a given fixed rate display. From the description of the purpose of the dynamic refresh rate it does not seem designed to change rates on a frame by frame basis (effectively what G-sync is doing) but to slow down refreshes to save power over many frames. Thus it works fine for a demo with a single fixed frequency animation, but won’t work for frames generated from variable motions in game play, etc. I find it hard to believe that the Nvidia engineers would not understand the current display capabilities and miss the fact that it could be done without added hardware. Especially when AMD was happy to sell you a second graphics card and had not noticed that the frames rendered by the second card were barely displayed.

    • Modivated1
    • 6 years ago

    A demand for my next monitor? DONE!

    • alientorni
    • 6 years ago

    amd has listened to my words.
    a screen tearing and stuttering solution that doesn’t require new exclusive hardware on gamers’ budgets

      • Airmantharp
      • 6 years ago

      It’s probably a small improvement over V-Sync on its own, but it’s definitely not nearly as useful as G-Sync. But if it means that AMD has the foundation for enabling G-Sync on compatible monitors, then it’s a very good sign.

        • alientorni
        • 6 years ago

        [quote<]but it's definitely not nearly as useful as G-Sync[/quote<] you're just assuming knowledge that has not been given nor proven.

          • Airmantharp
          • 6 years ago

          If it involves turning on V-Sync, it’s very clearly inferior. This isn’t an AMD vs. Nvidia thing, this is a fixed vs. not fixed thing.

            • MrJP
            • 6 years ago

            But if the refresh rate is variable, then V-sync is no longer a problem, surely?

            • Airmantharp
            • 6 years ago

            We’ve yet to see- AMD hasn’t given us any information to refute the effects of V-Sync on input latencies, and has focused on the ‘smoothness’ factor that eliminates stuttering that V-Sync normally incurs.

            That’s all fine and good for games/situations where you’d want to use V-Sync, but a lag penalty would be a pretty big turn-off for shooters, so one can understand that the benefits of FreeSync might be limited for those that focus more on shooters than, say, MMOs or adventure-style games.

    • Chrispy_
    • 6 years ago

    I’ll take an open VESA standard over some annoying proprietary tech any day of the week, thanks.

    • farrengottu
    • 6 years ago

    Didn’t Virtu solve the whole frame timing issue? I felt they did. I don’t see why we need Nvidia’s expensive option or AMD’s free but latency-increasing option.

      • Wildchild
      • 6 years ago

      There’s the issue with Virtu being a buggy mess and the fact they don’t push updates out fast enough. You can’t use Nvidia/AMD drivers that are newer than the latest version of Virtu. Here’s an example:

      3.0.108 | 32/64 bit: (Desktop, Windows 7/8/8.1) | [b<]Release date: Nov 27, 2013[/b<]
      3.0.107 | 32/64 bit: (Desktop, Windows 7/8/8.1) | [b<]Release date: May 30, 2013[/b<]
      [url<]http://lucidlogix.com/support/download/driverdownloads-mvp2/[/url<]

      • superjawes
      • 6 years ago

      The frame timing latencies are different from display timing latencies.

      Display timing is about when a frame finishes versus when it is actually displayed. FreeSync and G-Sync address this with dynamic refreshing. The idea is that the monitor can wait to scan instead of using a fixed 60/120/144 Hz resulting in improved smoothness.

      Frame timing is to address large spikes in Scott’s “inside the second” graphs. FCAT catches these too. Basically, the goal of frame timing is to reduce or eliminate those spikes so the animation doesn’t appear to freeze (stutter).

      • l33t-g4m3r
      • 6 years ago

      I was wondering earlier if something like Virtu could be implemented in the driver, and idiots kept repeating the limitations of double buffering, 30fps, and vsync, when I wasn’t talking about that type of refresh syncing.

      I was right, it can and does exist. Virtual Vsync. If only this was directly implemented in the driver.

      We don’t need freesync or gsync, we need Virtual Vsync. That can be done right now on existing hardware with a 3rd party software update. If I was using an intel cpu, I’d fork over the money right now.

    • Fighterpilot
    • 6 years ago

    Free-Sync pwns G-Sink LOL
    Ouch, that’s gotta hurt.

      • Airmantharp
      • 6 years ago

      Just can’t stop shillin’!

    • renz496
    • 6 years ago

    so does this free sync already work with games? so far there’s only been the demo with the windmill. when will amd show this tech working in games?

    • xeridea
    • 6 years ago

    I like how AMD is always coming up with ways of improving the game and computing industry, and Nvidia is always coming up with ways to sell expensive crap and proprietize everything.

      • indeego
      • 6 years ago

      This is a ridiculous statement. Both are for-profit companies that always have their best interest at heart. AMD’s move is completely to give pause to consumers before buying a Nvidia card. It’s too early to tell which one will advance gaming/computing.

        • clone
        • 6 years ago

        sort of but not really. the "sort of" is the minor part where yes, companies are out to make money (platitude); the "not really" is because, while others are willing to give stuff away for free in order to raise their exposure, Nvidia has historically been all about taking what they can and giving nothing back.

        look at this press release, if it was Nvidia saying it I’d know for sure they were trying to pooh pooh on AMD, it’s what Nvidia does.

        they have a history of trying everything from holding up a block of wood and calling it Fermi to denying access to review parts until journalists commit to openly promoting Nvidia exclusive features no matter how insignificant.

        AMD/ATI’s gfx history isn’t so much one of being “mean” or “aggressively discounting” their competitor as one of being slower to market.

        you could be right, this could be an Nvidia like tactic but I doubt it….. I always felt G-sync was more tweak than revolutionary.

        • xeridea
        • 6 years ago

        Yes, the bottom line is always a factor. I am saying AMD came up with this technology years ago as an improvement, Nvidia came up with a way to sell a $200 chip. They are just saying they already could do what Nvidia is doing years ago for free.

        It’s like years ago when Nvidia came up with CUDA to lock vendors into their cards, then AMD heavily promoted OpenCL.

        Reminds me of iOS and Android. It’s absurd Apple has some proprietary connector, so they can sell you a $30 cable and a $50 charger, and 100% of other devices use microUSB so you don’t need 10 cables to carry around.

          • indeego
          • 6 years ago

          NVIDIA’s “chip” is purely optional. You can play 100% of games without it. It doesn’t raise the cost of graphics cards, but [i<]only some monitors[/i<]. The market decides what it wants. This could flop, and Nvidia could lose partners based on reduced sales, due to it, or gain traction. This is how Capitalism works. Why would NVIDIA brag about a technology that benefits their competitors?

          [quote<]Reminds me of iOS and Android. It’s absurd Apple has some proprietary connector, so they can sell you a $30 cable and a $50 charger, and 100% of other devices use microUSB so you don’t need 10 cables to carry around.[/quote<]

          It isn’t absurd, it’s marketing genius. Apple is a marketing company first, technology company second. (Microsoft, Google, Oracle are examples of the opposite.) Joe user doesn’t care about the extra cost as long as it just works and it’s got magic marketing behind it. They really don’t. You or I do because we hate one company having control, or being forced into higher tiers of payment for a stupid connector. Joe User just wants their technology/status/whatever feeling elicits endorphins. To Apple’s credit, I have had some Micro USB chargers not work on some android devices, B&N Nook Color comes to mind.

          You are getting angry because companies are offering technology you don’t want, and somehow you are taking it personally. Stop taking things personally. Use AMD because you hate Nvidia’s attempt to improve [u<]their[/u<] market!

          [quote<]"It’s like years ago when Nvidia came up with CUDA to lock vendors into their cards, then AMD heavily promoted OpenCL."[/quote<]

          And OpenCL is thriving, right? That is exactly how it’s supposed to work. [b<]You incur risk when implementing something proprietary in any market.[/b<] See: Sony. It worked for them in some regards, burned them bad with Betamax.

      • HisDivineOrder
      • 6 years ago

      I like how AMD is always coming up with ways of improving the game and computing industry AFTER nVidia comes up with a way that mirrors the way AMD just magically came up with, but somehow AMD’s way never seems to gain traction in the industry. Regardless of whether they’re (often slowly) responding to an nVidia technology or coming up with their own, their ideas tend to fizzle out.

      Examples of quick comebacks yet slow or buggy responses include PhysX vs Havok GPU-assisted physics (poof, it’s gone!), 3dvision vs whatever AMD’s approach was called, SLI vs Crossfire has been a joke at AMD’s expense until VERY recently after they finally fixed frame latency issues that have affected them for years, Geforce Experience vs Raptr, CUDA vs AMD’s lack of support for most of its alternatives, and Optimus vs Enduro. AMD never has had a response to Adaptive v-sync. Meanwhile, until very recently, AMD’s come up with Eyefinity (mostly matched within months and perfectly matched within a generation) and …?

      Now recently, AMD announced two mostly proprietary technologies. Don’t even say Mantle is open. It’s not. Not yet. AMD won’t even go on the record and say it will definitively be open. They definitely haven’t begun the legwork to building a standards body to run it. So Mantle is currently a closed standard that does nothing to improve the industry and may threaten to fracture it if it were to gain any traction.

      TrueAudio benefits mainly headphones users, which is a decent sized group, but audio technologies have traditionally had a hard time convincing users the way improved graphics do. Especially when said audio technology is not ubiquitous because even among a given GPU’s generation, there are only a small subset that include said feature.

      In the scheme of things, nVidia is leading the way in technologies that get real use and they seem to do a much better job of really making them work well. AMD has a habit of screaming about some “free” or “open” equivalent when nVidia announces something, only to act like an ADHD kid and wander off in a few months with nary a word about their “free” or “open” equivalent afterward. They did copy TWIMTBP pretty well this past year, but that only took them… what? Ten years?

        • l33t-g4m3r
        • 6 years ago

        Agree with most of this. AMD does support Adaptive V-sync. They just don’t expose it to the end user through CCC, which is ridiculously bloated for all the features it exposes. You can enable it through Radeon Pro though.

        I doubt Mantle will gain any traction if it isn’t an open standard, and nvidia’s taking the smarter approach to efficiency by sticking an arm cpu on the gpu. Nvidia will have mantle like results without using mantle, while AMD will only see those benefits in mantle supported games. Mantle also negates any arguments about nvidia proprietizing everything, because they’re both doing it now.

        I like the idea of TrueAudio, and it doesn’t only benefit headphone users. Anyone who’s heard A3d or EAX4+ would agree. The problem here is again support, and also compatibility. If AMD won’t allow nvidia users to run it with a second card, then it’s likely DOA. AMD’s best bet is to release a dedicated dsp card for non-amd users, which would maximize marketshare.

        I agree Nvidia implements features much better than AMD, aside from having a higher cost. Where’s AMD’s 3d vision or physics? Doesn’t exist. They kinda outsourced 3d to tridef, which is absolutely horrible. Took em long enough to fix crossfire, but I think freesync should arrive a little quicker, considering they already have a test demo. Monitor support is probably just as spotty on both sides, and IMO nvidia could support “freesync” as well, considering it’s part of the VESA standard, and not a proprietary amd feature.

      • jihadjoe
      • 6 years ago

      IMO Nvidia is actually ahead of AMD on the innovative ideas part, but they have an annoying habit of making their solutions locked down and proprietary.

    • jessterman21
    • 6 years ago

    So… True Adaptive Vsync?

    • mutantmagnet
    • 6 years ago

    [quote<]In Koduri's assessment, it's possible to achieve a G-Sync-like animation smoothness with a combination of two techniques: dynamic refresh rates and triple buffering. [/quote<]

    This would be immediately offputting for people who place way too much emphasis on input lag performance.

    [quote<] The only challenge is that each display must know how long its panel can sustain the proper color intensity before it begins to fade. [/quote<]

    This would be offputting to people who put too much importance on the color reproduction of IPS panels.

    [quote<]but he initially expressed puzzlement over why Nvidia chose to implement them in expensive, external hardware. [/quote<]

    When I was hoping for AMD to make an open standard to challenge gsync I wanted it to be as good if not better than it. Reading between the lines this isn't as effective and doesn't even sound like it's good enough for the "good enough because it's cheaper (TM)" standard.

    You have to ask yourself why monitor manufacturers you (correctly IMO) describe as too lazy to adopt new standards would bother using an overly expensive FPGA over the cheaper VESA standard.

    The only thing that makes me excited about this implementation is that Nvidia screwed up not being in any of the consoles and you could offer something that minimizes the poor frame rate drops that most likely will happen if the games on the 360 and PS3 are any indication.

      • LaChupacabra
      • 6 years ago

      I think you missed the point of both your…points.

    • superjawes
    • 6 years ago

    First off, let’s be clear that this is generally good news. I’ve said since G-Sync was announced that the idea behind it is a metaphorical genie in the bottle that would eventually get adopted everywhere. [i<][b<]IF[/i<][/b<] AMD can implement this or even add adoption, it would be great.

    However, there are some key points that raise red flags as to why G-Sync might still be better. For instance, there is a note about needing to know calculations in order to prevent color shifting. According to the writeup published last week by Scott, it sounds to me like Nvidia has already taken these into consideration, hence the external hardware.

    On top of that, triple buffering is [i<]not[/i<] how G-Sync is described. Triple buffering means putting completed frames into memory. You have one allocation for the current frame, a second for the next frame, and a third "just in case" buffer. From everything Nvidia has said, the GPU refreshes the monitor ASAP (immediately after the frame is ready to be displayed). This makes G-Sync superior because the system is trying to display the most accurate and recent information at any given time. If you throw in triple buffering, twitch FPS gamers (and probably all the pros) will turn it off for sure.

    Again, great news pushing variable refreshing to monitors, but I suspect there will be some advantages to G-Sync. Remember, it's only expensive because it is in the FPGA stage of development. Once Nvidia hammers out all the details, the G-Sync premium should approach zero.

      • dodozoid
      • 6 years ago

      [quote<] Once Nvidia hammers out all the details, the G-Sync premium should approach zero. [/quote<] Not entirely accurate... MANUFACTURING COST premium should approach zero; they will however probably let you pay for licensing as they do with SLI-ready mobos...

        • superjawes
        • 6 years ago

        Depends on how they want to sell it. Even if they license, I suspect the user cost will be minimal, but what they can do is manufacture the ASICs themselves and sell them to monitor manufacturers. That would be very near zero in terms of price deltas because Nvidia would just be replacing what monitors already need.

    • puppetworx
    • 6 years ago

    Dear display manufacturers,

    Less G-Sync, more 120+Hz monitors.

    Cereally,
    PC Gaming Mustard-Race

      • superjawes
      • 6 years ago

      [url=http://i.imgur.com/c7NJRa2.gif<]Why not both?[/url<]

        • UnfriendlyFire
        • 6 years ago

        For $4,000, no problem!

        (I’d rather put that money in the stock market)

          • superjawes
          • 6 years ago

          Um, the early G-Sync monitors are all 120+Hz. [url=https://techreport.com/news/25857/27-asus-monitor-has-799-price-tag-g-sync-support<]Here's one for $800.[/url<] [url=https://techreport.com/news/25854/philips-intros-4k-and-g-sync-monitors<]And here's one for $650.[/url<] Now, 120 Hz + G-sync + 4k might be $4,000, but I've yet to see any of those.

            • derFunkenstein
            • 6 years ago

            And you won’t for a while. Bandwidth be killer, yo.

            • superjawes
            • 6 years ago

            Bandwidth wouldn’t be as much of a killer with G-Sync though, as the refresh rate just gets translated to a scan time, and lower scan times make anything look smoother. 60 FPS will look smoother with an 8.3 ms scan time than a 16.7 ms one.

            But yeah, hopefully there will be a renewed focus on bandwidth as 4k gets adopted. Otherwise we’ll see little more than 30 FPS @ 4k =/
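
            (Editorial aside: the rough arithmetic behind the scan-time and bandwidth numbers in this exchange, ignoring blanking intervals and link overhead, works out as sketched below.)

                # Back-of-the-envelope scan times and raw pixel bandwidth (no blanking/overhead).
                def scan_time_ms(refresh_hz):
                    return 1000.0 / refresh_hz

                def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
                    return width * height * refresh_hz * bits_per_pixel / 1e9

                print(scan_time_ms(60), scan_time_ms(120))    # ~16.7 ms vs ~8.3 ms per scan
                print(pixel_rate_gbps(3840, 2160, 60))        # ~11.9 Gbps: fits DisplayPort 1.2
                print(pixel_rate_gbps(3840, 2160, 120))       # ~23.9 Gbps: beyond 2014 single-cable links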

            • mcnabney
            • 6 years ago

            That’s what we have now with HDMI1.4. 4Kp30. DP can do 60 as can HDMI2.0. Nothing yet on the horizon for single cable 4Kp120. Maybe even up the color depth to 10 or 12 while you are at it too.

      • HisDivineOrder
      • 6 years ago

      Dear display manufacturers,

      Less TN, more IPS.

      With the Utmost Respect and Love Despite All Those Horrible Years of TN Panels Being Marketed For Gamers (Who You Must Believe Are Blind),

      The Real PC Gaming Master Race That Demands Proper Panels.

      PS:

      We know 120hz IPS is possible because hacked Korean cheapos can do it, so just install the controller boards that make it possible in all your monitors and stop futzing around.

    • derFunkenstein
    • 6 years ago

    If this has been part of the VESA spec for…a while at least, and vsync’s performance-limiting issues have been around forever, why are we just now getting into this? Why did AMD wait for NVidia to create some sort of competing proprietary non-standard before just fixing it?

      • Krogoth
      • 6 years ago

      Because frankly, not enough people care enough about it. It is a rather trivial problem compared to other graphical issues, like not having enough processing power to handle 4-megapixel gaming at smooth framerates, reducing screen aliasing, improving texture quality, improving shadowing and lighting effects, etc.

      • DPete27
      • 6 years ago

      Because up until [url=https://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking<]a couple years ago nobody even knew about frame time variances[/url<]. Just shows how young the graphics world still is.

        • Airmantharp
        • 6 years ago

        Oh, we knew about them. We just had no real way to measure them. Nvidia was pretty clearly working on smoothing them out, too…

          • Krogoth
          • 6 years ago

          Experts knew how to measure it and resolve their issues. The problem is that hardly anybody cared about it, and shareholders of major graphics companies saw little reason to spend R&D until now. Nvidia needs something to hold the enthusiasts over in an age where integrated and lower-end GPUs are starting to become *good enough* and Maxwell isn’t ready yet.

            • Airmantharp
            • 6 years ago

            Must it be a conspiracy?

            Seriously, Nvidia had been working on getting frame-times straightened out for the better part of a decade. AMD acted like they’d never heard of them. Given that the results of taking care of the problem are very real, quantifiable, and the lack of said focus very obviously destroyed value, I’d say that plenty of people knew about them, and not nearly all of them being individuals that might be considered ‘experts’. The goons over at [H] weren’t being empirical about it, but they were highlighting AMD’s issues for a very long time, and were quite on point.

        • derFunkenstein
        • 6 years ago

        But everybody knew that Vsync caused tearing. So why not come up with a way to display the full framerate without the tearing? That’s obviously going to be super-dee-duper smooth compared to whatever Vsync you’re stuck at.

    • Tristan
    • 6 years ago

    Finally something good from AMD
    It is clear that G-Sync is just a costly workaround, a complicated (and expensive) solution for a simple problem. Hope that vblank-enabled monitors catch on quickly.

      • bittermann
      • 6 years ago

      +1. I’m surprised the Nvidia fan boys haven’t down voted the crap out of you yet.

        • CasbahBoy
        • 6 years ago

        Maybe it has to do with being the more simple, less expensive, and more open solution. Just a guess, but in technology circles that is universally thought of as “the right way” when it doesn’t impede user choice.

    • setzer
    • 6 years ago

    I should also point out that at least for the i915 driver under Linux (HD3000 for example) there is support for dynamic refresh rates from Intel; the option is LVDS_DOWNCLOCK, which by its description is the same feature Koduri’s talking about.

      • willmore
      • 6 years ago

      Yep, I was just looking at that a week back. Sadly, my craptop uses the LVDS link instead of the eDP link, so it’s not available to me. 🙁

    • chuckula
    • 6 years ago

    Is it related to this? [url<]http://liliputing.com/2012/04/intel-future-could-use-less-power-panel-self-refresh-tech.html[/url<] If AMD can come up with a non-proprietary version of G-sync, then more power to them. However, you might still have to get a new monitor to support it.

      • Melvar
      • 6 years ago

      That’s the first thing I thought of when I read this, but I don’t think it’s related. What I think is going on here is AMD is buffering one frame ahead and timing how long it takes to render the next frame, then switching the monitor to the equivalent refresh rate before it sends the first frame. Obviously this requires the display to support dynamic refresh rate switching, but if it does, no further modification to the monitor should be needed. Do this every frame and you should have no stuttering and no tearing.

      Unfortunately, if this is how they are achieving this effect it will add a full frame of latency over what you would get with gsync/no vsync. That would be a deal breaker for any fast action game, but it might be better than nothing for slower games, and it could also be useful for playing videos that don’t match your normal refresh rate.
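
      (Editorial aside: schematically, the hold-one-frame scheme described above would look something like the sketch below; it is purely hypothetical, inferred from the comment rather than anything AMD has confirmed, and the one-frame delay is visible in the logic.)

          # Hypothetical "hold one frame, time the next" scheme from the comment above.
          def display_with_one_frame_delay(render_times_ms):
              held = None
              for frame, next_render_ms in enumerate(render_times_ms):
                  if held is not None:
                      # Switch the panel to the interval matching the *next* frame's render
                      # time, then scan out the frame that was held back (one frame late).
                      print(f"show frame {held} at {next_render_ms:.1f} ms "
                            f"(~{1000.0 / next_render_ms:.0f} Hz)")
                  held = frame
              print(f"show frame {held} at the previous interval")

          display_with_one_frame_delay([16.0, 22.5, 30.0, 12.5])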

        • chuckula
        • 6 years ago

        [quote<] Unfortunately, if this is how they are achieving this effect it will add a full frame of latency over what you would get with gsync/no vsync. [/quote<]

        There are a few latency-adding bits in G-sync too IIRC. It's part of the tradeoff with synchronization vs. the absolute lowest latency (where you see tearing). One frame of video is reasonable if the results are smooth.

        I'd be very interested to see AMD push this and for at least Intel to get on board with its IGPs to put pressure on Nvidia.

          • Melvar
          • 6 years ago

          [quote<]There are a few latency-adding bits in G-sync too IIRC. It's part of the tradeoff with synchronization vs. the absolute lowest latency (where you see tearing).[/quote<]

          This is true, but from what I recall it's (supposedly) only a few ms compared to 25ms for a full frame at 40fps. The main thing I'm going by is the initial reports that performance and latency seemed as good as with vsync off. I guess we'll see how they compare when both are available for comparison.

          I have no idea how these tech sites are going to benchmark this. Inside the VBLANK?

    • sschaem
    • 6 years ago

    OEMs should have no big problem selling freesync gaming laptops, considering the zero extra cost.
    Kaveri laptop with gcn/mantle, true audio DSP HW, HSA and now freesync? Humm.

    If bf4 really is 45% faster on Kaveri using mantle vs d3d… I see AMD having a very good 2014.

      • willmore
      • 6 years ago

      Yeah, I’ve always liked the idea of being able to game a little on my laptop, but ‘gaming laptops’ are hulking, expensive ‘desktop in a box’ machines that aren’t very functional laptops anymore. But, this…..

      eDP has supported variable refresh (does no one remember Intel talking about this a few years back?) for a while. They were pushing it as a power saving measure–for static displays on laptops.

    • l33t-g4m3r
    • 6 years ago

    I like how proactive and forward thinking AMD has been in supporting all these features through the driver that already exist in hardware. IMO, AMD should just open source the driver, and the community could do it for them. A good example is RadeonPro vs CCC. That or continue to play catch up to the closest competitor with actual vision.

    • Bensam123
    • 6 years ago

    ROFL freesync ><

    It would be quite hilarious if this can simply be implemented with a few tweaks. Although I think monitor manufacturers probably won’t update the firmware on their monitors, that hasn’t stopped the DIY community from duct-taping things together.

    The only downside I could see about this is the buffer, which creates latency. If(!) G-Sync doesn’t use one, then it definitely gives it a leg up on freesync… but freesync also seems to be… free. I’m guessing this technology could also be improved to not use a buffer, since this was originally made for a different purpose… it can probably be reengineered. Since the results are so promising already I don’t doubt it’ll happen.

      • BlackStar
      • 6 years ago

      Triple buffering does not increase latency over double buffering, quite the opposite.

      Unfortunately triple buffering has been co-opted by game developers to mean a longer swapchain, mainly because that’s what DirectX exposes. These two are not the same thing, but the terms have been so muddled over the years that now no one knows what they actually mean any more.

        • Bensam123
        • 6 years ago

        So it’s possible for triple buffering to not incur a latency penalty if used properly with a different graphics API such as OGL or Mantle?

          • Melvar
          • 6 years ago

          Correct. Used properly it will increase frame rate (over double buffering with vsync enabled, not over vsync off) without increasing latency.

            • Bensam123
            • 6 years ago

            So it’ll still cause a latency increase, just not as much as vsync?

            That’s still increasing latency. :l

            • superjawes
            • 6 years ago

            Any time you have a buffer, you have a latency increase. VSync eliminates tearing by using at least two buffers. One is so the display always has a whole frame to display. The second is for the GPU to write to. The third buffer is useful when the GPU finishes writing that second frame during a scan. Without it, the GPU would have to wait for the scan to finish before the buffer became available again.

            And there is a latency penalty going to G-Sync over No-Sync, too, as No-Sync always displays the most recent frame information. This tears the image, of course, but you are still guaranteed that the on-screen information is (or was) the most recent.

            • Bensam123
            • 6 years ago

            Aye, that was my original point though… that there is a buffer and as such a latency penalty. Whereas with the Nvidia solution, we may(?) be looking at a solution that doesn’t use a buffer to do what it does. He just made it seem like you could eliminate the latency penalty and we went in a circle.

            • superjawes
            • 6 years ago

            That’s why I’m suspicious about this as well. Everything Nvidia has been putting out there suggests minimal buffering, if any. If that is misleading, shame on Nvidia.

            However, if that is not the case, then this “FreeSync” demonstration would just be to distract and confuse.

      • HisDivineOrder
      • 6 years ago

      The thing is… why didn’t they push this before nVidia? Why always ride on nVidia’s coattails? Why wait for nVidia to market it before saying, “Hai guyz, we can do this!”

      They should have been talking about this a long time ago. Not waiting until the other guy starts his parade to stick some reindeer ears on his Buick and say, “Hey, I got a free parade over here!”

        • nanoflower
        • 6 years ago

        I would guess it’s because they tried before (since they had it in their drivers) but didn’t find much interest from the monitor manufacturers. That may be because it was seen solely as a way to save power. It’s not known how much of a savings it would provide but my guess is when they studied the marketplace they didn’t find that having the feature made much of a difference to customers.

        Now Nvidia comes along and has put real money behind their solution (and may (speculation) even have paid ASUS a bit to buy in) and a new spin on it. Spinning it as providing better quality gaming graphics puts it into a different market. One where monitor manufacturers are already building special monitors just for gamers, so adding in a new feature isn’t that big a deal.

          • superjawes
          • 6 years ago

          Well several years ago we were still talking about gaming graphics in terms of FPS. We still are, to some degree, but since the focus has shifted toward frame times, large latency spikes that were masked in FPS averages are now taking center stage.

          I see G-sync as an extension of that focus shift.

        • Bensam123
        • 6 years ago

        Maybe AMD doesn’t have as much manpower or coincidentally time to do things. Something about shaving off staff and breaking the bank.

    • cataphract
    • 6 years ago

    Anandtech has [url=http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014<]a video[/url<].

      • rahulahl
      • 6 years ago

      Yea, except they uploaded their 60FPS video on YouTube, which reduced it down to 30.
      I saw the video, and it looked useless to me. I couldn’t see any difference between the 2.

        • remon
        • 6 years ago

        They uploaded a 60 fps video on youtube, slowed down to half speed. Half speed of 60 is 30 fps. Such a hard concept to grasp.

          • rahulahl
          • 6 years ago

          Doesn’t change the fact that I didn’t see any noticeable difference between the 2.

            • kn00tcn
            • 6 years ago

            then look closer? i don’t know what to tell you… when he zooms in on the left one, it’s very stuttery

        • cataphract
        • 6 years ago

        Yes, it was stupid of them to upload it to youtube. Still, I can definitely tell the difference between the two.

        EDIT: strike that, it’s at half speed

    • Klimax
    • 6 years ago

    Koduri is either ignorant of NVidia’s tech (unlikely) or is trying to spread FUD and misinformation. (Almost lies)

      • B.A.Frayd
      • 6 years ago

      He’s certainly not ignorant. He knows exactly what G-Sync is, and why it is such a threat to AMD’s bottom line.

        • Klimax
        • 6 years ago

        That’s why I put it as unlikely, but then why is he spreading misinformation and inventing “theories”?

        He is more or less contradicting many pieces of information already released by NVidia…

          • nanoflower
          • 6 years ago

          Why wouldn’t he do it when the marketplace still isn’t sure what G-Sync is? That’s the ideal time to strike and get people to think maybe this product isn’t right for them. Plus the fact that AMD has an alternative can also get people to hesitate to buy that Nvidia card and then have to spend extra money for the G-Sync monitor when their current monitor might work fine if they bought an AMD card.

            • Klimax
            • 6 years ago

            Well, I see no pitchforks against AMD for spreading FUD and misinformation, but whenever there is even just a hint that Intel/Microsoft/NVidia might not be completely honest about a competitor, then we see an outcry.

            Bloody double standards.

            Also, so far it’s only talk; no evidence their solution is at least as far along as NVidia’s own… (like with everything, talk first and maybe deliver later)

            • travbrad
            • 6 years ago

            but…but AMD is our knight in shining armor. They will take us away to a magical land where everyone is happy and all of our frames render smoothly.

      • auxy
      • 6 years ago

      Why is this getting upvoted?

        • Klimax
        • 6 years ago

        Maybe because I accurately described his assertions and theories?
        (And frankly, votes don’t matter much…)

          • rxc6
          • 6 years ago

          Yeah, votes are kinda like your opinions.

            • Klimax
            • 6 years ago

            Votes are just stupid clicks, not so for my posts…
            (Assuming that if you wanted to write generally, you wouldn’t include “your”)

      • Klimax
      • 6 years ago

      Hm, 16 bloody AMD fans… (at least) So how do you all explain the massive discrepancy between NVidia’s statements and Koduri’s own assertions and “theories”?

      You all cannot escape the fact that he is spreading FUD and misinformation, nothing more.

      ETA: I must note that it shows further big bloody double standards. Par for the course…

        • tcubed
        • 6 years ago

        ok let’s just say I’m an “amd fan” – you as the accusing party need to put forth the evidence for the “discrepancies”, not us for rejecting your accusations, theories and opinions.

        And also please provide examples of “further big bloody double standards”…

        That is if you really want to be taken seriously…

          • Klimax
          • 6 years ago

          Half is already in reply to your other post. (BTW: My evidence against him is the reviews and other write-ups on this tech.)

          Double standards? Attacks on anybody who dares to question Mantle (in contrast to the discussion about CUDA/PhysX), some replies to my post against Koduri. Frankly, too many instances… (Just note: it was a general statement and not meant to include only posters at TechReport.)
          BTW: You can find quite a lot of examples at Anandtech…

      • tcubed
      • 6 years ago

      please do tell us what about the article is “Almost lies”… I checked nvidia’s g-sync info… and NOTHING is contradictory to the article…

      He presents a theory about the actual reason for g-sync… He doesn’t say it is so; he even mentions:

      “I haven’t yet had time to confirm this detail with Nvidia or to quiz them about whether G-Sync essentially does triple-buffering in the module.”

      What is not clear there… he thinks, and the amd guy thinks, that the entire g-sync thing exists because NVIDIA can’t do triple buffering… which would really surprise me and I really don’t give much credence to this; I’m pretty sure that they could do it with a simple driver update if they wanted to. I simply think g-sync does the same thing as “free-sync” but, being on the monitor itself, control is much better and possibly smoother. But I think it is just there for absolute enthusiast gamers; none of the regular joe gamers (99%+) will even consider paying extra for something that solves a problem they probably don’t even know exists…

      On the other hand FreeSync for battery life seems legit, since the screen is usually the biggest power hog, unless you’re gaming heavily, when it’s just the second biggest power hog…

        • Klimax
        • 6 years ago

        First, I suggest you pay attention, because I have never attacked the article itself, only the assertions by AMD’s executive about G-Sync.

        [quote<] However, the exec initially expressed puzzlement over why Nvidia chose to implement them in expensive, external hardware. [/quote<]

        First yellow card. NVidia explained what and why: because the ASICs in regular monitors lack the necessary functionality.

        [quote<] His new theory is that the display controller in Nvidia's current GPUs simply can't support variable refresh intervals, hence the need for an external G-Sync unit. [/quote<]

        Second yellow card. Direct contradiction of NVidia's statements about the whole setup. Also contradicts existing observations. (If the HW couldn't support variable refresh rates, then how would it be any better than V-Sync?)

        Outcome? One red card and he is out of the field. (Football for you...) His "theories" are not based on any known facts, nor observations. They are simply failed hypotheses.

        As for the FPGA and the large amount of memory, it looks more like a modified FPGA development board than anything else. Similar to this: [url<]http://media.digikey.com/pdf/Data%20Sheets/Altera%20PDFs/DK-DEV-5AGXB3N_ES_Web.pdf[/url<] (There were smaller two-chip devkits than this, but I can't find them anymore on Altera's website.)

        BTW: Triple buffering wouldn't make much sense, because for a static refresh you only need one image, and in any other scenario it wouldn't get any use.

        Second BTW: As for gamers, even if they don't know, they would likely notice... (I notice such problems, but I don't like V-Sync so I ignore them. And then bugs in the engines themselves can get in the way.)

    • Airmantharp
    • 6 years ago

    From what I can gather, and this is without official word, this technology decidedly isn’t like G-Sync at all; that it requires V-Sync to be enabled is also worrying. That would imply additional lag, the lack of which is one of the reasons that G-Sync is attractive.

    However, AMD’s ready implementation of ‘FreeSync’ (it’s in the drivers already) is a very good sign that AMD may be able to enable G-Sync compatibility on their current GPUs. And that would be really, really cool.

    • TwoEars
    • 6 years ago

    “In Koduri’s assessment, it’s possible to achieve a G-Sync-like animation smoothness with a combination of two techniques: dynamic refresh rates and triple buffering.”

    If that is true, this is why nvidia assembled a team of 20 engineers and they all concluded that they needed a hardware card in the monitor. Triple buffering may be OK for the average joe, but any professional FPS or RTS player would rather do an entire tournament naked than play with triple buffering enabled.

      • Grigory
      • 6 years ago

      I often play naked and I am not even playing professionally.

      • Krogoth
      • 6 years ago

      That’s entirely inaccurate. The majority of progamers don’t care about input lag, buffering and tearing stuff. They only care about the things that actually matter: peripherals (keyboard, mice, controllers) and the user interface (which makes it easier to read and understand at a glance).

      They intentionally disable most of the eye candy effects to make it easier to see other players and use bright color skins for the models. They also like to dope up on caffeine and try to sneak in meth and amphetamines (a big problem in the eSports arena).

        • Prestige Worldwide
        • 6 years ago

        I’m sure input lag is on the higher end of their concerns; nobody wants to lose because their monitor is slowing them down and making their gameplay less precise. Input lag is the difference between getting a frag and being toast in Quake Live, especially against a top player.

        I’m not “pro” but have played competitively in a number of fps games and leagues, and input lag has always been one of the top criteria I look at when considering a new monitor.

        I know sponsorship has more to do with it than anything, but pretty much every MLG / Intel Extreme Masters PC tournament exclusively uses BenQ 120 / 144hz monitors, and those are the best monitors for a player in that use case (prior to G-Sync coming along).

          • Krogoth
          • 6 years ago

          Not even close.

          Progamers don’t rely on super-fast monitors or insane reflexes, although it can help. But it is not a deal-breaker.

          They rely on understanding game mechanics at an advanced level. They can *read* the movements of their opponents and know how to counter them in a fraction of a second. They understand the maps like the back of their hands and keep a close eye on item spawn times versus map control. They can easily estimate how many rounds it takes to kill their opponent based on where they are on the map and what potential power-ups, health and armor they picked up.

          They often get accused by lesser players of maphacking and wallhacking. They are also fairly accurate, though not perfect, and that’s why they aren’t aimbots.

          Input lag is a joke. The biggest bottleneck for input latency has always been the human body. The musculoskeletal system makes up most of it and the remainder is the nervous system. The human body is so much slower than a modern computer and a decent monitor.

          That’s why progamers often go work out at the gym and get into decent shape. They tend to be young as well, since one of the first things that starts to go with age is your body’s reflexes.

          They tend to consume stimulants which helps alleviate the latency issue on the nervous system’s end.

          Peripherals matter far more than the monitor because they are only means that you can interface with the computer itself. A good keyboard and mouse layout make a far larger impact than some super-fast monitor.

            • Firestarter
            • 6 years ago

            [quote<] because they are only means that you can interface with the computer itself [/quote<] yeah and big input lag puts a big damper on ALL inputs, regardless of how fancy your mouse is

            • Krogoth
            • 6 years ago

            Nope, the nervous system and musculoskeletal system are much slower than a decent monitor. That’s why progamers tend to be young and keep themselves in shape. They also partake in the consumption of stimulants (legal or illegal).

            • OhYeah
            • 6 years ago

            Krogoth, on some of the IPS/PVA monitors the input lag is so noticeable that you have to take it into account and adjust your gameplay according to it. Absolutely no pro-gamer will touch a non-120hz monitor for a crucial game because you’re giving your opponent a major advantage.

            • Krogoth
            • 6 years ago

            Again, it is utter nonsense and at best it is nothing more than a placebo at work.

            The human body is so much slower than a computer and how fast a monitor draws pixels onto a screen. It is like the difference between the random access speed of an HDD and an SSD. They aren’t even close; as a result, the SSD runs circles around the poor HDD.

            It is much easier to blame the computer and monitor for “perceived” shortcomings rather than pointing the problem at yourself. It is a natural coping and defense mechanism of the human mind.

            Input lag at best is a minor issue at high levels of play. Map control, tactics, strategies, physical reflexes and layout of keyboard/mice are much larger factors. This is what separates the winners from the losers in tournaments.

        • OhYeah
        • 6 years ago

        120hz monitor with low input lag is as important to gamers as are the mouse, keyboard and headphones. I could play with a crappy headset and a crappy non-mechanical keyboard, but I’m never going back to a 60hz monitor.

      • superjawes
      • 6 years ago

      I don’t think Koduri’s assessment is accurate. The whole point of G-Sync is that it [i<]doesn't[/i<] buffer frames, resulting in fast, recent information without tearing. If Nvidia is using triple buffering, then they are basically lying to everyone about how the system works.

      Btw, I imagine many/most pro gamers will turn G-Sync off, too. "No-Sync" will still display the fastest information at the expense of tearing. G-Sync's smoothness would have to offer better games awareness (which is hard to measure).

        • setzer
        • 6 years ago

        If you have read the known bits about G-Sync, you should know that they actually buffer frames; that’s the whole point of the massive amount of memory the hardware has:

        [quote<]The first-gen G-Sync module is pictured on the right. The biggest chip on the module is an FPGA, or field programmable gate array, which can be made to perform a range of custom tasks. In this case, the FPGA is serving as the development and early deployment vehicle for G-Sync. The FPGA is accompanied by a trio of DDR3 memory chips, each 256MB in capacity. I doubt the module requires all 768MB of memory to do its thing, but it likely needs the bandwidth provided by three separate memory channels. Nvidia tells me this first G-Sync module can handle 4K resolutions at refresh rates up to 60Hz.[/quote<]

        When the card doesn’t have a frame ready by the time the display needs to refresh, it will render the last frame in the buffer again. Scott wrote that also in his preview of G-Sync. So in fact you have something akin to a triple buffer.

          • superjawes
          • 6 years ago

          Nothing in that quote suggests that G-Sync uses triple buffering to achieve smoothness. Yes, G-Sync will display an old frame if the frame rate [i<]is below 30 FPS[/i<], but when you're above that threshold, the monitor will [b<]wait to scan[/b<] until the GPU tells it to, which is when the GPU finishes the new frame. At most, you should only need one buffer to implement this, and that's just in case the new frame takes more than 33.3 ms to render (slower than 30 FPS).

          Triple buffering has three buffers (obviously), and is used when the refresh rate is fixed. One is for the current scan--what is being shown on the monitor. The second is for the next frame. When that next frame isn't ready, the monitor will repeat what is on the first buffer while the GPU continues to draw the new frame. If the GPU finishes the next frame while the monitor is still scanning the first, it will write the next next frame to the third buffer. That way, when the frame rate is less than the monitor's refresh rate, the GPU always has an open buffer to draw to, improving the effective frame rate.

          I am not discounting the possibility that Nvidia is misleading everyone on the implementation, but all the excitement about G-Sync revolves around the purported immediacy of the implementation--that the monitor will display new frames as soon as they are ready, but without tearing.

      • dpaus
      • 6 years ago

      [quote<]why nvidia assembled a team of 20 engineers and they all concluded that they needed a hardware card in the monitor[/quote<]

      We don't know that any such conclusion was reached at all. It is far, far more likely that Nvidia simply decided to implement an expensive, proprietary solution as a profit-maximizing 'solution' and [i<]then[/i<] hired the 20 engineers to implement a patentable (i.e., protectable) product.

      • erwendigo
      • 6 years ago

      “Triple buffering may be OK for the average joe, but any professional FPS or RTS player would rather do an entire tournament naked than play with triple buffering enabled.”

      Triple buffering doesn’t introduce more lag than other vsync techniques; in fact it has less lag than double buffering. Only Vsync OFF has a very little less lag than triple buffering, and there is zero real advantage when you are using a 120 Hz monitor (8.3 ms per frame, with a typical case of 4.15 ms of extra lag, or added delay, with triple buffering).

      It’s irrelevant when the panel electronics and the human-machine I/O are operating on a whole other level of timing.
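
      (Editorial aside: the 4.15 ms figure is just half the 120 Hz refresh interval, i.e. the average wait for the next tick under the idealized assumption that frame completion times are uniformly distributed across the interval.)

          # Editorial check of the numbers above (idealized model).
          REFRESH_HZ = 120
          interval_ms = 1000.0 / REFRESH_HZ        # ~8.3 ms per refresh
          average_added_lag_ms = interval_ms / 2   # ~4.2 ms typical extra delay vs. no vsync
          worst_case_lag_ms = interval_ms          # a frame that just missed a tick
          print(interval_ms, average_added_lag_ms, worst_case_lag_ms)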

      • WaltC
      • 6 years ago

      Oh, gee…Surely you know the answer to this already…?

      Q: “How many nVidia engineers does it take to screw in a light bulb?”

      A:

      (Fill in the blank.)

      • sluggo
      • 6 years ago

      Because when you sell hardware you make money.

    • TwoEars
    • 6 years ago

    It’s going to be a good year to be a pc enthusiast.

    I can feel it already.

    Haswell-E, X99, DDR4, G-Sync, 1440P, 4k, Maxwell, dynamic refresh rates…. give me all of it.

      • Krogoth
      • 6 years ago

      Mostly refreshes of old architectures and designs with more digital roids thrown in to make them *faster*.

      G-Sync and such is actually old tech. The vast majority don’t care enough about tearing and other syncing issues with RAMDAC/TMDS and buffering to have made the video industry address it until now. Nvidia marketing drones saw it as a new marketing pitch to retain enthusiast interest in discrete GPUs. AMD is just going “me too” as a counterattack and avoiding missing the train.

      IMO, 2014 is going to more of the same evolutionary improvements on existing designs and archetypes.

      • indeego
      • 6 years ago

      None of what you post will really make too much of a difference in your day to day, except maybe “4K,” and all the others won’t be able to push pixels to it fast enough to make a difference at anything under $2K. So you are essentially saying “Give me new technology this year for $2,000+ at 30 fps or below.” The 780 Ti ($700) can’t push to a “4K” (not UHD!) screen at 30 fps with Battlefield 4.

      • HisDivineOrder
      • 6 years ago

      I’d bet the Maxwell we care about is going to be delayed. I suspect this mostly because I doubt AMD will field much before the end of the year and Intel seems destined to release an even less impressive CPU upgrade for the mainstream crowd than last year, which may encourage nVidia to stick with refreshes and rebrands until AMD decides to show up with some new technology.

      Why release more than you have to, especially when the fab process is still so new and expensive?

    • Firestarter
    • 6 years ago

    [quote<]In Koduri's assessment, it's possible to achieve a G-Sync-like animation smoothness with a combination of two techniques: dynamic refresh rates and [b<]triple buffering[/b<][/quote<] But won't triple buffering introduce quite a bit of input lag? Surely this cannot be what they want to do.

      • Meadows
      • 6 years ago

      I meant to press +1 but misclicked.
      I agree. (Unless “triple buffering” means something radically different to AMD.)

        • Melvar
        • 6 years ago

        I don’t get how triple buffering would even apply. My understanding of it is that when you have a fixed refresh rate and vsync is enabled, the front buffer has the frame that’s currently displayed, the back buffer has the next frame that’s being rendered, and if that frame is finished before the next vblank the third buffer lets the system start rendering the frame after that without waiting for the buffer flip. If you don’t have to wait for vblank then what will the third buffer be used for?

          • Zizy
          • 6 years ago

          Yeah, triple buffering has no effect with this when frame rates are under the monitor's refresh rate. The current frame sits in the first buffer, and the next frame gets rendered and sent immediately when done.
          However, above 60 FPS triple buffering might actually have some slight benefit (while Free Sync and G-Sync have no effect). Frame 1 is currently being displayed, frame 2 has been rendered and is waiting, frame 3 is being rendered. If frame 3 is ready before the next monitor refresh, frame 2 is discarded and frame 3 takes its place. If not, frame 2 is displayed.

          Still, gimme 1kHz+ OLED monitors already (and obviously this tech, this time for power saving purposes).

          • erwendigo
          • 6 years ago

          One front buffer (displayed), two back buffers. The two back buffers are a mechanism that keeps the GPU rendering new frames without any stall, independently of vsync. The GPU renders new frames into the back buffers, first one and then the other, and repeats the cycle until the vsync signal arrives.

          When a new vsync happens, the more recent of the two back buffers (the one holding the last rendered frame) becomes the new front buffer, and the other back buffer plus the old front buffer become the two back buffers.

          This mechanism guarantees that at every vsync you have the most recently rendered frame available (except when a frame takes longer to render than a full cycle between vsyncs).

          Double buffering doesn't guarantee that, and it introduces stalls on the GPU. Triple buffering never stops the GPU at all.

            • Meadows
            • 6 years ago

            THERE ISN’T A VSYNC SIGNAL!

            For pity’s sake, please don’t just read, but comprehend. Variable “vblank” means the GPU NEVER WAITS for vsync, because it comes on demand!

            So what *is* the third buffer for?

            • erwendigo
            • 6 years ago

            The third buffer is there so that at every moment the GPU is busy rendering a new frame.

            You don't have a standard VBLANK signal, but you do have a VARIABLE VBLANK that you can stretch or shorten, yet never fully control (if you want a clean image from the panel, you can't go below the panel's minimum latencies for pixel response times, and you can't hold off the VBLANK indefinitely, because the panel needs a refresh every so often).

            You can accelerate or decelerate the VBLANK, but you can't fully control it. So you need a mechanism that keeps the GPU busy and always has a finished frame buffer, the most recent one, ready to send to the monitor. Triple buffering guarantees this, and with variable VBLANK you also eliminate the minimum latency of waiting for a standard VBLANK.

            Think of a monitor with variable VBLANK and a refresh range of 20-60 Hz: you can pick any timing in that range to send an image to the panel (a wait of 16.6-50 ms between refreshes). So you need an algorithm to select the correct timing when you have a newly rendered image.

            You send a first image, then your system and GPU finish a second one 5 ms later, but you can't send it to the monitor NOW because the refresh imposes a minimum interval between frames of 16.6 ms.

            With a double-buffered system you can do nothing but wait, leave the GPU idle, and send, 11.6 ms later, a frame that no longer represents the best timing in the game engine for that exact moment.

            But with a triple-buffered system you can work on a third frame, finish it 5 ms later, and then move on to the next one and render it in another 5 ms. You now have a frame that is 10 ms ahead of the previous example in the game engine's timeline, and your panel is about to refresh (in 1.6 ms).

            Unless you accept tearing (vsync off), you'll ALWAYS have a little latency. Triple buffering plus variable VBLANK guarantees a very small one compared with standard vsync. G-Sync almost surely has an internal triple-buffering mechanism. That doesn't mean Nvidia cards can't use variable VBLANK as the AMD engineer says, or that software triple buffering is useless; it's a simpler workaround for Direct3D's lack of triple buffering. The Nvidia solution is just more transparent to the game/application.
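            (A tiny Python sketch of that 20-60 Hz example. The interval limits, the 5 ms frame time, and the schedule_scan helper are all made up for illustration; this is not how any real driver or scaler schedules scan-outs.)

                MIN_INTERVAL = 1000 / 60    # 16.7 ms: panel can't refresh faster than 60 Hz
                MAX_INTERVAL = 1000 / 20    # 50 ms: panel must refresh at least every 20 Hz

                def schedule_scan(last_refresh, frame_ready):
                    """Pick the time (ms) of the next scan-out, given a frame that
                    finished at frame_ready (None if no new frame is available)."""
                    earliest = last_refresh + MIN_INTERVAL
                    latest = last_refresh + MAX_INTERVAL
                    if frame_ready is None or frame_ready > latest:
                        # Nothing new arrives in time: the panel repeats the old frame.
                        return latest, "repeat old frame"
                    # Send as soon as the panel allows, never before the minimum interval.
                    return max(frame_ready, earliest), "show new frame"

                # A frame finished 5 ms after the last refresh must wait for the 16.7 ms
                # minimum; with a spare buffer the GPU keeps rendering meanwhile, so an
                # even newer frame may be the one that actually gets scanned out.
                print(schedule_scan(last_refresh=0.0, frame_ready=5.0))
                print(schedule_scan(last_refresh=0.0, frame_ready=None))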

            • 0g1
            • 6 years ago

            Great post! Explains why a GTX 700 series is required for G-Sync — because it has some hardware support for triple buffering.

        • TwoEars
        • 6 years ago

        Fixed it for you. Yes, if it indeed is triple buffering, it becomes kind of pointless.

        • BlackStar
        • 6 years ago

        Indeed, but it's game developers and gamers that misuse this term.

        Melvar got it right. Triple buffering + vsync actually decreases latency compared with double buffering + vsync.

          • Firestarter
          • 6 years ago

          But it still has more latency than no Vsync at all, right?

            • superjawes
            • 6 years ago

            Correct.

            No VSync results in tearing because the GPU will display new frames mid-refresh, so you have a mixture of information on the screen.

            VSync will only display whole frames, so the tearing goes away. However, when a new frame is late, the monitor will show frames more than once, essentially displaying old information even if the second frame is finished during the refresh. Triple buffering doesn’t fix this, but it improves overall frame rates by queuing up finished frames.

            G-Sync is a great idea because it [i<]doesn't[/i<] queue frames, at least as Nvidia has described. They basically turn off the fixed refresh on the monitor and display new frames as soon as they are available, but whole frames are displayed instead of tearing the image mid refresh. The result is a minimal lag increase over "no-sync" while still maximizing new information on screen.

            • BlackStar
            • 6 years ago

            Actually, triple buffering *does* improve latency if your rendering speed is higher than the display speed. Consider, for example, a 60Hz monitor coupled with a GPU capable of 120fps.

            Double buffering:
            – monitor at frame 1; gpu at frame 2; gpu wait
            – vblank
            – monitor at frame 2; gpu at frame 3; gpu wait
            – vblank
            – …
            Latency: 16.6 ms

            Triple buffering:
            – monitor frame 1; gpu at frame 2; gpu at frame 3; gpu wait
            – vblank
            – monitor frame 3; gpu at frame 4; gpu at frame 5; gpu wait
            – vblank
            – …
            Latency: 8.3 ms

            Latency is the amount of time between the last world update (i.e. beginning of frame #) until the moment this update reaches the monitor.

            This only helps when the gpu can run faster than the refresh rate. If they run at the same rate, or if the gpu is slower, then there is no difference between triple buffering and double buffering.

            Here is where gsync or free sync comes in: instead of waiting for a vblank signal, the gpu pushes out a frame as soon as it is ready. You essentially remove the gpu wait -> vblank part from the processing chain. Before, the only way to do this was to disable vsync. The difference is that gsync / free sync don't result in tearing – best of both worlds.

            Again, if your card is faster than your monitor, then this doesn’t really make a difference. The point of gsync / free sync is to get smoother framerate from slower cards.
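            (A quick back-of-envelope Python version of that 60 Hz / 120 fps example, using the latency definition above; the arithmetic is illustrative only and assumes the GPU starts a new frame immediately after finishing the previous one.)

                import math

                REFRESH = 1000 / 60   # ~16.7 ms between vblanks on a 60 Hz monitor
                FRAME = 1000 / 120    # ~8.3 ms per frame for a GPU running at 120 fps

                # Double buffering + vsync: the frame shown at a vblank was started at the
                # previous vblank (the GPU finished early and then stalled), so its world
                # update is a whole refresh period old.
                latency_double = REFRESH

                # Triple buffering + vsync: the GPU keeps rendering into the spare buffer,
                # so the frame flipped at a vblank is the newest one finished in time.
                frames_per_refresh = math.floor(REFRESH / FRAME + 1e-9)  # 2 here; epsilon guards rounding
                latency_triple = REFRESH - (frames_per_refresh - 1) * FRAME

                print(f"double buffering: ~{latency_double:.1f} ms")  # ~16.7 ms
                print(f"triple buffering: ~{latency_triple:.1f} ms")  # ~8.3 ms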

            • superjawes
            • 6 years ago

            I never said anything about triple buffering being better or worse in terms of latency 😉

            Really, any sort of buffering (storing information in memory before displaying it) means that there will be an added latency in the system. That’s why I suspect that pro gamers will skip G-Sync, as “no-sync” means the information is always accurate.

            And just so it’s out there, triple buffering does improve effective FPS when GPU FPS < refresh rate.

            • erwendigo
            • 6 years ago

            With many games, vsync off makes stuttering very visible. In those cases triple buffering eliminates the stuttering/micro-stuttering of the other options, vsync off included (only in some games, but it's possible).

            The added delay from triple buffering is negligible: around 8 ms with a 60 Hz monitor, around 4 ms with a 120 Hz one.

            • superjawes
            • 6 years ago

            Check out the stuttering in Scott’s slow motion videos. Triple buffering doesn’t eliminate stuttering because it still has to display an old frame when a new one isn’t ready, giving extra display time to that image. It is better than normal VSync, which can delay the GPU when the back buffer is already filled.

            • erwendigo
            • 6 years ago

            “Check out the stuttering in Scott’s slow motion videos. Triple buffering doesn’t eliminate stuttering…”

            Those videos weren't made with triple buffering. Direct3D doesn't directly support triple buffering, only double buffering, and very few games implement triple buffering in their vsync options. You can force triple buffering with third-party programs, but that wasn't the case in Scott's videos (which show the Nvidia vs. AMD frame-timing issues and solutions).

            Triple buffering “cheats” the game engine by showing it a permanent supply of free buffers. So the game engine runs at 100% speed, with the minimum possible delay (frame to frame, as if every frame went straight to the display), and triple buffering therefore works as an indirect frame-pacing workaround.

            There is some delay, but it's minimal. And games work very well with triple buffering when there is no incompatibility.

            Example:

            Assassin's Creed 2 and derivatives:

            Vsync in the game -> shame on you. Stuttering, with the fps oscillating between 30 and 60.

            Vsync off -> shame on you. Many frames per second but still stuttering (as if the game engine works with a fixed fps, or a multiple of it, for its internal timings).

            Forced triple buffering -> smooth like butter.

            With games like these, triple buffering works like a frame-pacing algorithm, because they need a “fixed vsync” to work flawlessly, just not one that jumps abruptly from 60 fps to 30 fps and back.

            And ALL of this with an Nvidia card/driver; the issues aren't all related to AMD and its drivers, or only to the GPU makers.

            • superjawes
            • 6 years ago

            Triple buffering doesn’t eliminate stuttering because it doesn’t eliminate quantization. With VSync, ANY time your frame rate is under the refresh rate, you get double exposure of a frame (at least). The effect can be small, or it can be quite large. Having an extra frame buffer isn’t going to speed up the rendering, so a frame that takes several refresh cycles to render is still going to result in a big stutter. All triple buffering does is increase the effective FPS over a larger time step*, so you can sometimes get three unique frames over four refresh cycles instead of two.

            I was referring to [url=https://techreport.com/review/25788/a-first-look-at-nvidia-g-sync-display-tech/3<]THESE slow motion videos,[/url<] by the way. The "buttery smoothness" of G-Sync is achieved by synchronizing the refresh without doubling up frames.

            EDIT: *Measuring FPS at all is generally a poor idea because it doesn't tell the whole story. This is why TR started doing time-sensitive metrics to begin with. Large latency spikes (in the rendering) ruin smoothness because they stick out so badly.
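            (A toy Python illustration of that quantization effect under a fixed 60 Hz refresh; the render times and the scans_on_screen helper are invented and not tied to any particular game or GPU.)

                import math

                REFRESH = 1000 / 60   # ms per refresh cycle at 60 Hz

                def scans_on_screen(render_ms):
                    """How many whole refresh cycles the previous frame stays on screen
                    when the next frame takes render_ms to produce (vsync on, nothing
                    rendered ahead)."""
                    return max(1, math.ceil(render_ms / REFRESH))

                for render_ms in (10, 17, 25, 40):
                    n = scans_on_screen(render_ms)
                    print(f"next frame in {render_ms} ms -> previous frame held for "
                          f"{n} scan(s) = {n * REFRESH:.1f} ms")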

            • erwendigo
            • 6 years ago

            Vsync off doesn't eliminate quantization either. You still send frames, the atomic and indivisible unit of imaging on a monitor (and no, a torn mix of frames isn't a continuum). Between two consecutive frames you still have a quantization of the game's internal time.

            If you are referring to the minimum time you must wait before sending a frame, and not to the quantization of frames per se, well, that happens with FreeSync AND with G-Sync. Neither can reduce that time to zero, because that would only be possible with a panel of INFINITE responsiveness (panels have a minimum wait, i.e. a maximum refresh rate).

            A panel's timing is ALWAYS greater than zero.
            Frames are ALWAYS a quantization of the game, with a timing that results from the CPU and GPU working together.

            And yes, this video is different from the one I was thinking of. But that isn't related to the merits of triple buffering. Skyrim is a D3D game, it doesn't use triple buffering at all, and in this test Scott says nothing about it.

            And I'm not saying that triple buffering is equal to or better than G-Sync, or than FreeSync. It has a smaller effect on delivering frames with the correct timing (the correct timing being “zero” delay after rendering; any vsync carries a penalty, though with triple buffering a very small one). But it is a “good enough” technique that is much cheaper than those two. If the developer implements it in the engine, it is much better than other vsync options.

            Finally, the smoothness of G-Sync isn't about “not doubling up” frames; it's about eliminating the lag of waiting for vsync and improving the timing of the frame the machine finally sends to the monitor (triple buffering is much better than double buffering when the GPU is fast enough to fill both buffers and stall; triple buffering keeps working and, with luck, when vsync arrives it has a very recently rendered frame in one of the back buffers).

            • Meadows
            • 6 years ago

            No, I’d love G-sync, *without* vsync. That’s the way I’ll roll once I can be bothered to replace my display. (Probably H2 2014.)

      • erwendigo
      • 6 years ago

      No, triple buffering is better than the alternative vsync techniques. Only vsync off has the advantage of lower added lag, and only a little.

      • sunaiac
      • 6 years ago

      Triple buffering is just a way to swap buffers.
      It usually means lag because you work with fixed frame times of 16 ms, which means between 16 and 33 ms of lag.

      Now imagine the base interval at which buffers can be swapped comes down to 1 ms; I don't think you'd feel the lag 🙂
      Especially since triple buffering is not a two-frame prerender queue, so stale frames can be dropped.

    • SilentViking
    • 6 years ago

    Is this different from the Dynamic V-Sync implemented in Radeon Pro?

    [url<]http://www.radeonpro.info/features/dynamic-vsync-control/[/url<]

    Obviously not including the monitor support. Just curious if Radeon Pro is bringing a hidden feature forward.

    Edit: nvm, the answer is yes, it's different; I forgot to read.

      • Melvar
      • 6 years ago

      That seems to be AMD’s version of what Nvidia calls adaptive vsync; it just turns vsync on or off depending on whether your current framerate is higher or lower than the monitor refresh rate. It’s not at all the same thing as gsync/free sync.

    • Krogoth
    • 6 years ago

    I'd rather wait for the VESA group to come up with something, so it would be more vendor-agnostic and have a better chance of being picked up by the industry rather than being stuck as a niche.

      • ibnarabi
      • 6 years ago

      The vblank signal is already part of the VESA DisplayPort specification, which is why AMD has supported it for years.

      • Krogoth
      • 6 years ago

      Nvidia shills detected

        • Airmantharp
        • 6 years ago

        Butt-hurt AMD apologist?

        I’m as excited as the next guy that AMD might be able to provide a comparable solution to G-Sync, or otherwise enable compatibility with it, but right now ‘Free-Sync’ as envisioned doesn’t seem to do much to help the overall problem.

          • rahulahl
          • 6 years ago

          To be honest, if this does work like G-sync as far as the smoothness is concerned, even with the v-sync lag I would be happy with it.

          I might not be a typical gamer, but I don't mind playing games like Counter-Strike and TF2 with vsync,
          probably because those games easily run at well over 200 fps.
          I really hate stuttering and tearing in games, and I wouldn't mind a solution that's not quite as good as G-Sync if it's free and can be enabled on a decent number of existing systems.

          When I buy a new monitor, I certainly would consider the G-sync factor vs the price, but it’s good to know that if I can’t find the monitor I want with G-sync, I can still make use of this free-sync.

            • Airmantharp
            • 6 years ago

            Remember that the monitor still has to support Free-Sync; there’s a reason AMD used laptops for their demonstration, and nothing comes for free :).

            To that end, for said games, using V-Sync puts you at a severe disadvantage with respect to input lag. Tearing sucks, sure. But dying constantly due to input lag sucks more.

            • Krogoth
            • 6 years ago

            You are dying because of internet latency, server/client prediction and your own body being too slow.

            Input lag from the monitor is a small portion of the equation and is the least of your concerns in a twitch shooter.

            • Airmantharp
            • 6 years ago

            Ah, but your opponent is suffering from the same problems too- but if you’re running V-Sync and he isn’t, you’re putting yourself at a disadvantage. It may not be much, but in my decade and a half of playing online shooters, it has always made a difference.

            • BlackStar
            • 6 years ago

            Don’t worry, you’ll grow out of it soon enough.

            • Krogoth
            • 6 years ago

            Not much of a disadvantage. There are so many other, larger factors that influence online gaming. I have played enough online games, and almost all of my latency-related issues stem from internet connectivity, not from the output of the monitor.

            That's why LAN parties still retain their appeal. They eliminate almost all of the issues associated with the internet. You can still run into networking-related issues on a poorly managed LAN hosting 100+ nodes.

            • Airmantharp
            • 6 years ago

            I see where you’re coming from; it’s the same thing we’ve been fighting since dial-up became fast enough for real-time gaming.

            But enabling V-Sync results in your system always feeling ‘a step behind’. Counter-Strike, Battlefield, whatever else is hampered just enough to make you feel like you’re playing in mud.

    • Melvar
    • 6 years ago

    I really hope this works out. Variable refresh rates are something I’ve wanted since long before the G-Sync announcement, and even though I currently have an Nvidia card the idea of them owning this concept until a patent expired was really troubling.

    That said, I remain skeptical about this really being the same thing but somehow without extra hardware. I also don’t buy the idea that Nvidia cards can’t send frames at variable rates when that seems to be exactly what they’re sending to the G-Sync hardware.

    Maybe one of the G-Sync monitor vendors will announce that they’ll support Free Sync as well. It would be fun to watch Nvidia try to spin that.

      • l33t-g4m3r
      • 6 years ago

      Why can’t nvidia cards support freesync too? Gsync might work a little better, but free is free.

        • BlackStar
        • 6 years ago

        It’s a matter of the hardware supporting variable refresh rates. Apparently AMD hardware has supported this since the 6xx0 series, whereas Nvidia hardware doesn’t.

          • l33t-g4m3r
          • 6 years ago

          Vblank is part of VESA. I don't see why Nvidia wouldn't support it, although that might be possible. Nvidia could have just gone the G-Sync route for performance reasons, but we won't know until later.

            • BlackStar
            • 6 years ago

            Nvidia supports vblank. All PC hardware has, for thirty years.

            The question is, can the hardware generate vblanks at a *varying rate* without a full mode switch (where the screen goes blank)? AMD apparently added this capability as a power-saving measure a few years ago. It’s unclear if Nvidia has done the same.

            The monitor also needs to support this feature, but apparently this is a matter of firmware. Exciting!

          • spiked_mistborn
          • 6 years ago

          Nvidia doesn't support it because I don't think they thought of this until March 2013, when they stole my idea for G-Sync here: [url<]https://techreport.com/discussion/24553/inside-the-second-with-nvidia-frame-capture-tools?post=719842[/url<]

          Never even got a "thanks" from Nvidia for "borrowing" my idea. They must have started working on the FPGA the next day so they could get it out quickly.

            • MathMan
            • 6 years ago

            The VESA standard already supported the feature of syncing a display to the GPU. And as a board member of the VESA organization, Nvidia must have known about it, because one of the tasks of board members is to vote on new standards.

            Nvidia already had a product named G-SYNC in their Quadro line (they have apparently renamed it to Quadro Sync, but there are still many links to it as G-SYNC). I didn't check this, but it stands to reason that they already had a trademark filed for it, so reusing that name would have been by far the cheapest option.

            And, finally, this technology gets rid of (v)ertical sync and replaces it with syncing to the (g)pu. So G-SYNC is a pretty obvious name.

            Can you remind me what Nvidia should be thanking you for?

            Or was this just posted as a joke?

            • l33t-g4m3r
            • 6 years ago

            Nvidia is all over the references in the dynamic frame rate patent. I think the probability they support this in hardware is very high.
            [url<]http://www.google.com/patents/US20080055318[/url<]

            • spiked_mistborn
            • 6 years ago

            Damn, they beat me to it by a few years, and they had everything I thought of in March: variable refresh based on content, frame buffer in the display, only sending changed information and not the whole scene for bandwidth and power savings.

            So the question is, if they have had this patent since 2006, why are they only now coming out with a product? Why make us suffer with tearing or vsync quantization all these years?

    • Essence
    • 6 years ago

    WOW – That's so bad for people who already got G-spanked

      • Melvar
      • 6 years ago

      You mean those poor, troubled souls that already get to play with this feature without having to wait for it to be released? We should all feel sorry for them.

    • UnfriendlyFire
    • 6 years ago

    Well that’s a way to take a dump in the G-Sync party’s punch bowl.

    Oh well, free features through driver updates? I'll take it.

    Any estimates of how long it will take for AMD’s driver developer to implement it? The frame buffering meter took a little bit of time anyways.

      • Essence
      • 6 years ago

      It's already a VESA standard, and AMD has said they have had this tech, including patents for dynamic frame rate adjustment, since 2006, so support from AMD should be quick if not instant. I just hope monitor makers who aren't tied to Nvidia (e.g. Samsung, Dell, HP, etc.) start pushing this if it isn't already supported, or maybe a BIOS update is all that's required (more info)

      [url<]http://www.google.com/patents/US20080055318[/url<]

      • Pantsu
      • 6 years ago

      It'll most likely require at least a new revision of the monitors, and of course AMD exposing the feature in CCC. I think the problem is not that it's hard to add dynamic VBLANK support, but that the dynamic refresh rate will impact colors, and making everything work properly requires some effort. AMD first needs to partner with a display manufacturer to implement the feature before others will bother to support it. If AMD doesn't have an exposed feature ready, why would anyone bother? How long will it take AMD to do this? Who knows. I doubt it's anytime soon, but perhaps they'll have something out this year.
