Here’s 240-FPS footage of AMD’s FreeSync tech—and some new info

We’ve been hearing about FreeSync, AMD’s answer to Nvidia’s G-Sync variable refresh display tech, for just over a year now. This week at CES, we finally got a chance to see FreeSync in action, and we used that opportunity to shoot some enlightening 240-FPS footage. We were able to find out some new specifics from AMD, as well.

The demo was running on a 4K display with a peak refresh rate of 60Hz. If you maximize the video above, you can see the refresh rate reported in the bottom-right corner. Without FreeSync, the display is pegged at 60Hz, and the rendered scene tears quite a bit. (The effect is particularly visible around the windmill’s blades.) With FreeSync on, the display hovers between 44 and 45Hz, matching the scene’s frame rate and eliminating tearing altogether.
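
For a rough sense of what the panel is doing, here is a back-of-the-envelope Python sketch. It is a deliberately simplified toy model of our own, not AMD’s code, using the demo’s roughly 45 FPS render rate and the panel’s 60Hz ceiling: with V-sync off on a fixed-refresh display, nearly every buffer flip lands inside an ongoing scanout and shows up as a tear, while a variable-refresh display simply waits in vblank and starts each scanout once a finished frame arrives.

    # Toy model: ~45 FPS rendering on a fixed 60Hz panel with V-sync off.
    render_ms = 1000.0 / 45    # a new frame finishes every ~22.2 ms
    refresh_ms = 1000.0 / 60   # a fixed-rate panel starts a scanout every ~16.7 ms

    # The front buffer flips as soon as a frame is done; a flip that lands
    # inside an ongoing scanout splits that refresh between two frames (a tear).
    flips = [n * render_ms for n in range(1, 45)]   # roughly one second of frames
    torn = {int(t // refresh_ms) for t in flips}    # which refreshes a flip lands in
    print(f"{len(torn)} of 60 refreshes contain a mid-scanout flip")

    # With variable refresh, the panel instead waits for each finished frame and
    # only then scans it out, so every refresh shows one complete frame and the
    # reported rate simply tracks the render rate (the 44-45Hz readout in the video).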

As we learned earlier this week, five display makers have FreeSync-certified monitors in the works. Some of those monitors will mirror the 4K resolution and 60Hz cap of the demo unit, while others will offer refresh rates as high as 144Hz and resolutions ranging from 1080p to 2560×1440. Minimum supported refresh rates will vary from display to display, but the technology can go as low as 9Hz, we’re told. The supported maximum is 144Hz.

Certification of FreeSync monitors will be handled by AMD directly. The company says it wants to ensure its brand is synonymous with a “good experience.” The certification process will be free of charge, the company tells us, so it hopefully won’t add to the cost of FreeSync panels. That said, AMD says its drivers will also allow variable-refresh mojo with non-FreeSync-certified panels, provided those panels support the DisplayPort 1.2a Adaptive-Sync specification. One such monitor will be Asus’ MG279Q, which we saw earlier this week. With or without certification, though, a FreeSync-capable Radeon GPU will be required for variable refresh rates to work.

FreeSync panels entered mass production last month, and AMD says 11 of them will be available by the end of March. Asus’ MG279Q, too, is due out late in the first quarter. The number of capable displays will grow to as many as 20 by the end of the year, AMD predicts, so there should be a nice stable of them for Radeon users to choose from.

Comments closed
    • sparkman
    • 5 years ago

    Any info or predictions out there about whether existing DisplayPort cards such as my GTX770 will become compatible with FreeSync via a driver upgrade?

      • Pwnstar
      • 5 years ago

      No. Remember, not even most of the cards AMD sells are compatible.

      • renz496
      • 5 years ago

      AFAIK, Adaptive Sync/FreeSync specifically needs DP 1.2a to function. If a simple driver update could turn DP 1.2 into DP 1.2a, AMD would be the first to make noise about its GCN 1.0 parts (the 280X, etc.) being able to use FreeSync with a driver update alone. Also, it is well known that Nvidia has no interest in making Adaptive Sync work with its cards.

    • Sabresiberian
    • 5 years ago

    Does the part where Freesync is turned on look slightly blurrier to anyone else?

      • Jigar
      • 5 years ago

      Go home Nvidia, you are drunk.

        • Sabresiberian
        • 5 years ago

        You assume everyone else is prejudiced because you are.

        All I was doing was asking if others saw what I did. If you didn’t then the proper response is something like “No it doesn’t look blurrier to me after Freesync is turned on”.

        Do I favor Nvidia? I favor having PhysX, and that’s really all anyone gets over what AMD offers right now. But it certainly isn’t a must-have for me – in fact it can be a detriment in some games because it can add too much to the screen. Borderlands is an excellent example. And at current prices there is no way I can honestly recommend a GTX 980 over a Radeon 290X. I would not recommend the 970 to a Borderlands fan like myself because the 900-series Nvidia cards do not perform as well as the 290X does in that game (unless they have fixed the issue the 900-series cards have with Borderlands; pretty silly, though, that AMD cards outperform Nvidia in an Nvidia-optimized game).

        I also favor companies pushing us beyond current graphics limits. AMD had the Freesync technology but only used it to smooth out performance in low powered notebooks – what’s up with that? It took Nvidia developing their own solution for us to move forward. I’m not a fan of G-Sync in that it is currently proprietary and costs way too much money. I’m solidly against being locked into one brand of video cards because I bought a monitor that is only supported by that one brand. I wish AMD had recognized the potential of what they came to call Freesync years ago, but now they have, and now we have a new standard everyone can adhere to without raising the cost of monitors more than a few dollars (unless of course manufacturers decide to gouge the public).

        AMD brought us Mantle, which forced Microsoft to finally get off its butt and spend some real development time and money on DirectX improvements that we should have had years ago. That in and of itself puts AMD at least on an equal innovation footing with Nvidia, in my mind. It shows me that while they might have been slow in introducing a technology they actually already had, they DO care about the quality of what we see – and they certainly care about supporting game devs, which of course ultimately supports gamers.

        So, yeah, saying I’m a drunken Nvidia fanboy is, uh, inaccurate, to say the least. As far as I’m concerned both companies have their faults, both have good points, and neither one raises its head above the other, overall. I buy what is best for me at the time I’m making my purchase; neither company has proven to me that it deserves my loyalty over the other.

          • Pwnstar
          • 5 years ago

          You obviously favor nVidia, despite protests to the contrary.

    • itachi
    • 5 years ago

    What about input lag? Is it as bad as with normal Vsync, by the way? Always wondered… it would be nice to see an article addressing this issue :p.

    Also, as awesome as it is, I also got that feeling like… holy, they’re only getting around to fixing this whole Vsync mess in 2014-15, lol!

      • Pwnstar
      • 5 years ago

      You know, you could always just not run Vsync if you care about latency that much.

      • Tirk
      • 5 years ago

      Here is a link from AMD’s website that explains how it approaches vblank and what it does for input latency:

      http://support.amd.com/en-us/search/faq/226

      Take from it what you will.

    • Bensam123
    • 5 years ago

    Cue the fanboi decrees of how horrible AMD is for bringing such technology to market and how G-Sync will be a superior product as all those filthy muggle monitors won’t support it… And AMD will lock Nvidia out because Nvidia won’t choose to support it too! Those bastards (AMD)!

    Nvidia could choose not to support this to try and perpetuate G-Sync for a while, but once Intel supports it, then it’s set and match.

      • Prestige Worldwide
      • 5 years ago

      Depends if FreeSync does the job as well as G-Sync. I will buy the superior technology, and FreeSync has yet to be proven by reputable reviewers.

      • chuckula
      • 5 years ago

      Intel will probably support Freesync at some point since Freesync is an extension of some of the earlier panel self refresh technology that Intel has been trying to push to save power in notebook/tablet devices. That also explains why the earliest Freesync demos were done using laptop displays since desktop displays didn’t yet have the needed features.

      http://liliputing.com/2012/04/intel-future-could-use-less-power-panel-self-refresh-tech.html

      • Sabresiberian
      • 5 years ago

      I always listen to people who call someone with a differing opinion a “fanboi”. Such original thinking, such enlightenment! Such genius in being able to lump valid arguments with unreasonable comments in order to have an excuse to ignore them all!

    • renz496
    • 5 years ago

    Also, this got me thinking: when is Intel going to support Adaptive Sync?

      • semitope
      • 5 years ago

      I’m more interested in whether my gtx 970 already supports it and nvidia is just being lame to not announce. Otherwise, AMD will be my choice next time around (unless nvidia does with newer GPUs).

        • renz496
        • 5 years ago

        I thought Nvidia had mentioned in the past that they have no intention of supporting Adaptive Sync. Even the 900 series only supports DP 1.2.

      • wierdo
      • 5 years ago

      Yeah if Intel does that, that’ll settle the debate in one stroke; their integrated market presence would rapidly drive adoption at the OEM level like a wildfire.

        • EndlessWaves
        • 5 years ago

        Depends on the software support.

        As a technology it works best at full screen and outside of gaming there aren’t a lot of applications that run full screen. Video playback might be a killer app, particularly in former PAL regions operating at 25/50hz, although Adobe aren’t exactly known for being quick to adopt new technology.

        I don’t know whether it would be possible to have different parts of the screen refresh at different rates or whether you’d have to default to 60hz when two refresh-sensitive windows were playing at once. Either way I suspect any sort of support for windowed applications to be a long way off.

          • Bensam123
          • 5 years ago

          If it’s truly ‘free’, then low-end monitors and monitors of all shapes and sizes will support it. It will make its way into ‘tick box’ territory, where the other guys simply don’t have it and, generally speaking, it’s something good to have (if it’s free).

    • Klimax
    • 5 years ago

    I see, once more, wrong info being passed around.

    Reminder:
    https://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

    “However, Petersen quickly pointed out an important detail about AMD’s "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

    That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia’s own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia’s intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

    When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn’t think it’s necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That’s why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.”
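
    For what it’s worth, the vblank-extension mechanism Petersen describes is easy to sketch. The snippet below is a toy illustration only (the 144Hz and 40Hz panel limits are assumed for the example; real scalers and drivers are more involved): after each scanout the panel simply sits in vblank until the next finished frame shows up, clamped between the fastest and slowest refresh it supports.

        # Toy sketch of variable refresh via an extended vblank interval.
        MIN_FRAME_MS = 1000.0 / 144   # assumed maximum refresh rate of the panel
        MAX_FRAME_MS = 1000.0 / 40    # assumed minimum refresh rate of the panel

        def next_scanout(last_scanout_ms, frame_ready_ms):
            """When the panel leaves vblank and starts its next scanout."""
            earliest = last_scanout_ms + MIN_FRAME_MS   # can't refresh faster than this
            latest = last_scanout_ms + MAX_FRAME_MS     # vblank can't stretch forever
            return min(max(frame_ready_ms, earliest), latest)

        print(next_scanout(0.0, 22.0))  # 22.0 -> the refresh tracks the GPU (~45Hz)
        print(next_scanout(0.0, 3.0))   # ~6.9 -> an early frame waits for the panel
        print(next_scanout(0.0, 40.0))  # 25.0 -> a late frame: the old image repaints first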

      • Phartindust
      • 5 years ago

      “To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, “we would know.””

      Either he is lying, or uninformed.

      Back in September, the scaler manufacturers Realtek, Novatek, and MStar announced support for FreeSync.

      http://www.bit-tech.net/news/hardware/2014/09/22/amd-freesync-deal/1

        • anotherengineer
        • 5 years ago

        And from that article.

        “The new scalers, promised to hit the market by the end of the year, will support FreeSync and Adaptive-Sync as well as the usual features of picture scaling, on-screen displays, DisplayPort High Bit-Rate Audio and legacy HDMI and DVI input types – although these, naturally, won’t include the adaptive refresh technology. ”

        The last 9 words are key I think.

          • EndlessWaves
          • 5 years ago

          Adaptive Sync is part of the DisplayPort spec. Of course DVI will never support it. HDMI may add it in the next version, but all this has come about since HDMI’s last update so it doesn’t currently.

            • Kurotetsu
            • 5 years ago

            From what I understand, HDMI will never have it either because, like DVI, it operates on a fixed clock rate. Only DisplayPort is free of that fixed-clock requirement, which is what adaptive sync needs.

        • renz496
        • 5 years ago

        And when Petersen said that none existed, I think that was early 2014. AMD proposed the spec for DP 1.2a more toward the middle of the year. Then we heard about the scalers in September; those are the updated scalers that will support Adaptive Sync and FreeSync.

      • puppetworx
      • 5 years ago

      Thanks for the text wall, but what specifically is the wrong info being passed around?

    • Klimax
    • 5 years ago

    Finally, something tangible. Felt like waiting for Godot… Now we’ll see how it handles irregular cases in games, and we’re set for a showdown. (Remember, here we have a constant pattern; in games it will usually be all over the place and more stochastic, making things harder.)

    • odizzido
    • 5 years ago

    Gsync will have no bearing on my next video card purchase, however supporting DP 1.2a will. Get with it Nvidia.

      • UnfriendlyFire
      • 5 years ago

      I know someone who only buys Nvidia GPUs because of PhysX. I wish I was joking.

        • JustAnEngineer
        • 5 years ago

        We have a few commenters that are compensated with NVidia swag to be anti-AMD, but the most obnoxious ones are probably just fanboys.

          • Klimax
          • 5 years ago

          A popular assertion that usually comes without a shred of evidence, making it just a poor ad hominem…

    • wololo
    • 5 years ago

    To anyone saying AMD should have shown a game-centric demo: they did. At the very least they did a Tomb Raider benchmark demo, if not more. I think Linus featured it on his channel.

    • renz496
    • 5 years ago

    So when can TR do extensive testing and a comparison with G-Sync? Hopefully we don’t have to wait for monitors to be widely available first. Can TR ask AMD to supply a monitor for an early review?

    • ipach
    • 5 years ago

    AMD should enable this in their APUs and, by extension, in gaming consoles. And while they’re at it, put it in new Sony TVs or something. I’d trade smart TV features or 3D for tear-free gaming smoothness any day. In fact, it wouldn’t be too unlike Sony of old to scoop up exclusive access to the tech for their TVs and call their TVs the best place to play.

      • Rurouni
      • 5 years ago

      Kaveri APU is supported. The only problem is that it only works with DP. I use Kaveri, but my mainboard doesn’t have DP, thus I need to buy a new GPU.
      Because of that, although the hardware can probably support adaptive sync, consoles won’t have it because they don’t use DP. Maybe Sony can release a new PS4 with DP and at the same time release TVs with DP. Until then, no adaptive sync on consoles.

      • puppetworx
      • 5 years ago

      You’re dead right about Sony. This is exactly the technology that consoles and TVs need but it requires someone like Sony to implement DisplayPort on both ends for it to work.

    • kilkennycat
    • 5 years ago

    When TR gets a FreeSync monitor for evaluation, I suggest one of the user-perceptual tests be the Mordor benchmark. This benchmark includes a running on-screen graphical profile of FPS, and there are a bunch of SHARP changes in FPS during the benchmark. I have a G-Sync monitor… regardless of the sharp changes, the display is always visibly buttery smooth (V-sync off). Not so with the fixed-60Hz monitor it replaced. Similarly on a bunch of other gaming benchmarks (Thief, Arkham, etc.). The windmill demo in the above article obviously has too little graphical computational complexity to disturb its frame rate and thus is no real test of dynamic challenges to FreeSync. Why no gaming benchmark demo at CES? Does the Emperor have any clothes? Maybe FreeSync is intended solely for streaming-video applications unable to keep up with a 60Hz refresh rate on a 4K monitor and not for gamers at all?

    Obviously, TR will take FreeSync through a battery of more technical tests, but the overall perceptual result on computationally challenging graphical material will be one ultimate measure of its success with the PC action-gaming enthusiast. The other key measure will be input latency, critical for competitive action gaming. No need for triple-buffering with G-Sync.

    • wingless
    • 5 years ago

    I have a 144Hz G-Sync ASUS. One of the main reasons I switched to Nvidia was for 144Hz and 3D capabilities. G-sync was a side benefit that turned out to be the best feature one can imagine. Freesync may have me take a good hard look at a 290X (an upgrade from my current GTX 760) since their price has dropped so low. AMD is narrowing the amount of killer apps Nvidia has…

    • Tristan
    • 5 years ago

    This windmill demo is nonsense. The framerate does not change, so we don’t know whether the monitor can adapt flawlessly to a changing framerate. That is the real technical challenge, rather than just lowering the vertical frequency to a fixed level. In many FPS games the framerate may change many times per second (caused by fast rotations, explosions, etc.). Is this monitor able to change its refresh rate that quickly? We don’t know. It’s suspicious that they didn’t show how FreeSync performs in real, demanding FPS games.

      • Firestarter
      • 5 years ago

      I agree, even though it isn’t completely nonsense, this windmill demo only really shows that Freesync delivers on one basic promise in a well controlled static environment. The real challenge, as you say, is how well it holds up in games.

        • Wild Thing
        • 5 years ago

        It was tested on two games according to Jarred Walton at AT…(and worked satisfactorily)

          • Mandrake
          • 5 years ago

          AMD was showing off demos that they explicitly chose and strictly controlled. I choose to err on the side of caution and wait until TR and others have the opportunity to carefully and methodically review and scrutinise this technology before arriving at an opinion of it. I think that’s just common sense.

    • Srsly_Bro
    • 5 years ago

    So I need a new video card to use this tech, can my HD 7950 even output 4k resolution?

    • Krogoth
    • 5 years ago

    Great, LCDs are starting to simulate what good CRTs used to do.

      • oldog
      • 5 years ago

      Do OLEDs do this or is this behavior more typical of LCDs?

      • jts888
      • 5 years ago

      sorry, but you’re confusing CRT multisync with variable refresh intervals in LCDs.

      CRTs needed ~100 kHz flyback transformers to drive their horizontal deflection coils and up to ~200 Hz for vertical deflection.

      No CRT in existence could dynamically change the vertical refresh circuit resonance fast enough to account for variable frame rate nor change electron gun power fast enough to suppress the brightness flickering that would result.
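
      (As a rough sanity check on those figures, with example CRT numbers of my own rather than anything from this thread: the horizontal scan rate is roughly the vertical refresh multiplied by the total number of scanlines per frame, including blanking.)

          # Example numbers only: a 1200-line CRT at 85Hz with ~5% vertical blanking.
          vertical_hz = 85
          total_lines = 1200 * 1.05                 # visible lines plus blanking overhead
          horizontal_khz = vertical_hz * total_lines / 1000
          print(f"~{horizontal_khz:.0f} kHz horizontal scan rate")   # ~107 kHz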

        • Voldenuit
        • 5 years ago

        Voldenuit is not impressed at Krogoth’s misconception.

      • Firestarter
      • 5 years ago

      so your CRTs never showed any tearing? Must have been nice, living in the future

        • Krogoth
        • 5 years ago

        It is only noticeable if you pan the screen around like a hyped-up kid on energy drinks. It is very noticeable on most LCD units even with slow-paced animation; TN and IPS LCDs with a strobing backlight manage to avoid most of it.

        The difference is that I don’t make a huge fuss over it.

        It is almost as silly as audiophiles being ultra-picky about wooden knobs and gold-plated cable connections. The masses are in the same camp. It is a vocal minority of videophiles that are crying about it like it is the end of the world.

          • Ninjitsu
          • 5 years ago

          Lots of noticeable tearing while playing the original NFS Most Wanted at 60-85 fps.

          Other games too, but I can’t remember specifics.

          • Firestarter
          • 5 years ago

          We were playing Quake and Unreal and the like, of course we were panning around like hyped-up kids on energy drinks, because we were!

          As for your second argument, I kind of see where you’re coming from but I still completely disagree with you. Yeah, there are many situations where tearing has never been a major issue, not even for the most nit-picky of us, but that doesn’t excuse the times that tearing and juddering absolutely does mess up the visuals and whatever the game designers were trying to convey with them. It’s an error that has existed for decades for no good reason other than that it was the way things were before and they deemed backwards compatibility more important than quality. It’s high time that this gets fixed and I look forward to getting it fixed for my own system. You can keep using your old monitors, they’ll be no worse then than how they are right now.

            • Krogoth
            • 5 years ago

            It never has been an error. It’s more like a shortcoming of a legacy throwback from the late-1960s era, when video display technology for computers was in its infancy.

            The industry didn’t bother to look back until now, because it wasn’t that big of an issue to begin with. It is only coming up now because the GPU guys are running out of steam with increasing performance and shrinking the GPU, so they need to look elsewhere to keep themselves in business. Behold: they decided to look back at how monitors and video cards work and address the shortcomings of the ancient standards.

          • TardOnPC
          • 5 years ago

          For digital audio the quality of the cable is irrelevant. For analog, well, try plugging your instrument into an interface with a coat hanger and tell me it sounds the same as a nice Mogami cable.

      • rahulahl
      • 5 years ago

      CRTs, while good, were horrible on the eyes.
      At 85Hz, I could still see the flickering. At 60, the flickering was so much worse that I got migraines from it.
      Granted, it might not affect everyone the same way, but at least for me, LCDs are way better than CRTs.
      And I would rather deal with tearing than flickering any day.

        • odizzido
        • 5 years ago

        60hz CRTs actually burn my eyes. I have to push them to 75 for them to be usable for more than thirty seconds, and 85 for more than a few hours. At home I always ran my CRT at 100hz to be comfortably usable. Plasma displays have the flickering as well I find, though for me it feels around the same as a 90hz CRT so they’re not horrible.

        LCD monitors I can use for as long as I want and never have any issues though. LCD is the superior technology for my eyes.

        • Ninjitsu
        • 5 years ago

        Hmmm. I’d notice 75 Hz, not 85 Hz. 85 Hz was fine for me. 60 Hz was horrible.

          • Generic
          • 5 years ago

          Beat me to it, I never had issues with 85Hz either.

          rahulahl must have “special” eyes: http://youtu.be/-J9V-pV5BSE

      • Meadows
      • 5 years ago

      CRTs never had dynamic refresh rates.

    • anotherengineer
    • 5 years ago

    First time I’m getting excited about PC gear since I installed my SSD over 3 years ago.

    If that BenQ XL2730Z is indeed an AHVA (IPS type) screen with 144hz and adaptive sync from 30hz to 144hz, and the price is right I may pull the trigger on it. (after reviews of course)

      • dodozoid
      • 5 years ago

        If it doesn’t cost a kidney, it’s my next screen.

        • pdegan2814
        • 5 years ago

        I’m thinking the same thing. Not sure if I’ll have any great need for a high refresh rate, since I’ll more likely be using it to smooth out the images at <60fps. But I’ll hate giving up 16:10 if they don’t leave me any choice, it’s amazing what a difference that extra bit of vertical space makes.

          • derFunkenstein
          • 5 years ago

          This is going to be more than just a little extra bit of vertical space. Whatever you have, this is like 1/3 more height.

      • anotherengineer
      • 5 years ago

      Well, apparently the Acer will have AHVA (IPS)-type viewing angles with a 144Hz refresh rate.

      http://www.tftcentral.co.uk/news_archive/32.htm#acer_xb270hu_ips

      Happy Happy Joy Joy. Hopefully BenQ and others use this panel, to get some variety.

      Edit - it will be interesting, if they offer a non-G-Sync option, to see the actual price difference.

    • HisDivineOrder
    • 5 years ago

    Glad it works.

    If AMD wants this technology to become a real standard used by other companies, they really should stop calling it Freesync and call it by the name everyone else will use. Kinda reminds me of how HDMI-CEC is branded by every TV/device vendor as some special name even though they’re just CEC-based.

    If the goal is to get the technology out there used broadly, then make it simple and call it VESA’s Adaptive Sync (which is already tragically/comically/tragicomically similar to nVidia’s previously named Adaptive V-Sync) because that’s the standard.

    Continuing to use Freesync is essentially AMD trying to scream to the world, “WE MADE THIS! WE MADE THIS! IT WAS OURS!”

    It’s like a child ignoring the forest for the trees. The best way to get it in use is to get nVidia and Intel on board ASAP. Trying to market a standard you helped engineer as if it were a proprietary technology is probably the best way to keep everyone else away for as long as possible.

    Whereas if Adaptive-Sync were just another standard, nVidia wouldn’t feel pressure to stay away. If it becomes a “thing” for AMD, then nVidia is likely to stay away as long as possible and continue pushing G-Sync. That may sound like a win for AMD, but it isn’t. It keeps at least some vendors still pushing out G-Sync monitors, which keeps the whole adaptive sync monitor thing divided.

    And I’d really rather they weren’t divided at all. I just want them to move on, go all in on one side, stop blathering on about proprietary/branding nonsense (on both sides), and get me a cheapo monitor with 1.2a based on an IPS panel with 4k for sub-$500 at 32″. Doesn’t have to do more than 60hz, but it does need to be IPS and 30-ish inches.

    Bonus points if it’s 16:10.

      • jts888
      • 5 years ago

      If I need to buy a new monitor and GPU to take advantage of variable syncing anyway and the monitors for one solution cost $170-$200 more than the other with minimal if any performance differences, I’m just going to get the cheaper monitor and apply my savings towards an even beefier and/or additional GPU.

      Nvidia needs to give up on their display-side custom hardware ASAP unless they can upgrade the functionality dramatically and soon.

        • Ninjitsu
        • 5 years ago

        Yeah, that’s been my view too. I have a 60 Hz 1080p IPS display, and I have a GTX 560. To get either FreeSync or G-Sync I need a new (and expensive) monitor and a newer GPU.

        Rather just get a GTX 970 or better and enjoy minimums of mostly 60 fps.

          • pdegan2814
          • 5 years ago

          Except some of the newest games won’t let you push 60fps at 1080p even on a GTX970, not consistently, unless you turn down the visuals. Yes, some of that is because some of the big games to come out recently were riddled with bugs upon release, but even after several patches, you are NOT going to max out.

      • smilingcrow
      • 5 years ago

      ‘ get me a cheapo monitor with 1.2a based on an IPS panel with 4k for sub-$500 at 32″ ‘

      Sounds like you’ll be buying a direct import off eBay at that sort of price.

      • Kretschmer
      • 5 years ago

      So you’re blaming AMD for Nvidia’s hypothetical stubbornness and want a fairy-dust cheap IPS 32″ 4k display? That’s a downvote.

      They’ll both end up supporting 1.2a. Nvidia will call it G-Sync 2.0; AMD will call it Freesync. I don’t begrudge AMD their lone moment of competent marketing, as it’s led to a consumer-friendly standard being developed. I’ll probably wait for Nvidia to support 1.2a before I buy into the new standard, but props to AMD for getting the ball rolling.

        • sweatshopking
        • 5 years ago

        Downvoted for whining.

          • Kretschmer
          • 5 years ago

          Severely disappointed with lack of caps, but upvoted for thematic consistency across posts.

      • Krogoth
      • 5 years ago

      16:10 is going to remain professional-tier only, so units are going to carry that premium.

      16:9 has taken over the gaming and mainstream space.

      • GhostBOT
      • 5 years ago

      “Glad it works.” ?

      You say that as if you weren’t expecting it to work?

        • Pwnstar
        • 5 years ago

        That’s his bias speaking.

    • DPete27
    • 5 years ago

    Free-sync monitors aren’t even available for retail yet and I’m already upset about the whole Free-Sync vs. Gsync dilemma. Consumers shouldn’t have to lock themselves into one GPU manufacturer for the life of a monitor (which tends to be pretty long) to get variable refresh rates. I think the industry needs to pick one or the other QUICKLY and unify everything.

      • jts888
      • 5 years ago

      The industry will be making a choice by default toward Adaptive-Sync, since the major scaler ASIC vendors are rolling the functionality into all their new chips, which costs essentially nothing more to manufacture while retaining broad functionality.

      G-sync monitors only work with Nvidia cards, can’t have any HDMI/DVI/VGA/whatever inputs for using cable boxes or consoles, and cost too much extra.

      The fight will be over very quickly short of miraculous new functionality from Nvidia.

        • UnfriendlyFire
        • 5 years ago

        There’s a reason why Firewire and recently Thunderbolt lost against USB.

        Because USB is cheap to implement and can be jammed into small circuits. Those other two ports aren’t cheap, and I think Thunderbolt had more restrictions on how its circuits could be arranged on a motherboard.

    • Voldenuit
    • 5 years ago

    I’ll be curious to see how Freesync and Gsync compare on a latency basis, but there’s no way I’m paying an extra $100-150 for a Gsync monitor unless the difference is dramatic.

      • wingless
      • 5 years ago

        The batch of FreeSync monitors TR reported on from CES just a couple of days ago has surprisingly low prices. It seems that the current hardware in a lot of displays already supports the feature; it just isn’t enabled in firmware. The result is that NEW displays hitting the market have the feature enabled at no additional cost to the manufacturer or consumer… unlike G-Sync (which I have and love).

        • Voldenuit
        • 5 years ago

        I was talking about G-SYNC displays being $100-150 more expensive, since they have to use a custom FPGA.

          • wingless
          • 5 years ago

          Sorry about my reading comprehension. Yes, I have to agree. I think Nvidia has more nuanced control since they’re using high power FPGAs and onboard memory or whatnot, but AMD and VESA accomplished the same in a much easier way. G-Sync, although a stellar technical feat, isn’t worth the premium at the end of the day.

      • Firestarter
      • 5 years ago

      Yeah, all this video shows is that it works, but it doesn’t show how well it works. If it adds latency or can’t cope well with highly variable framerates, then G-Sync would still have a leg up on FreeSync. I just want to see these two technologies compared to each other with latency measurements.

      • Krogoth
      • 5 years ago

      Don’t waste your breath.

      Your hands and arms are way slower than a decent LCD unit at moving pixels.

        • Ninjitsu
        • 5 years ago

        But what about our eyes, good sir.

          • Krogoth
          • 5 years ago

          Visual cortex has its own limitations.

            • thor84no
            • 5 years ago

            You forgot to move your hands mysteriously while spewing your weasel-words.

            • pandemonium
            • 5 years ago

            I’m confused how you can simultaneously praise CRT technology for having had AFPS all along and also say that our visual acuity and motor reflexes aren’t capable of noticing it or quick enough to respond to it anyway.

            You crack me up, man.

            • Krogoth
            • 5 years ago

            CRTs are faster than LCDs, but at the end of the day, input lag is mostly a problem on the user end unless you’ve got significant hardware issues.

            • dragontamer5788
            • 5 years ago

            http://youtu.be/5r6_XNELwFE?t=1m15s

            Oh yeah, the limits are there. But it’s way faster than 60Hz. I play against players who can pull this sort of stuff off consistently.

      • puppetworx
      • 5 years ago

      Richard Huddy claimed at CES (http://youtu.be/G7-jn07jy8I?t=5m9s) that FreeSync should have lower latency than G-Sync, since FreeSync does not have to sync between GPU and monitor every frame like G-Sync does; it simply ‘pushes’ frames. However, he also said they haven’t measured the lag difference yet, though they will soon.

      A few months ago TFT Central tested the ROG Swift (http://www.tftcentral.co.uk/reviews/content/asus_rog_swift_pg278q.htm#lag) and concluded it had extremely low input lag (~4ms). So FreeSync can’t really be significantly faster than that, but if that kind of input lag carries across the whole range of FreeSync monitors, it’s a very good thing. Let’s hope.

      Update: It just struck me that the TFT Central review doesn’t state whether or not they were using G-Sync for the input lag test; the structure of the article and context suggest not. Handily, they do mention an input lag test performed by BlurBusters (http://www.blurbusters.com/gsync/preview2/), and it is a very interesting read because of their input lag testing methodology (really simple and very cheap), and also because they found a problem with G-Sync that introduces a lot of lag (20ms) while playing at very high FPS. It seems like G-Sync was otherwise introducing a few ms of lag compared to V-Sync off - nothing significant. I highly recommend every gamer give that article a read.

        • Flapdrol
        • 5 years ago

        You can’t go faster than the monitor; if that happens, you have to wait before you can refresh, which means you build up latency like with vsync. With FreeSync you can choose to let it tear instead, but in either case it’s better to use an in-game FPS limiter a bit below the maximum refresh.
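
        As a rough illustration of that margin (the 144Hz ceiling and the 3 FPS cushion below are just example numbers, not something from this thread):

            # Cap the frame rate slightly under the panel's maximum refresh so the
            # GPU never finishes a frame while the previous one is still queued.
            max_refresh_hz = 144
            cap_hz = max_refresh_hz - 3   # e.g. a 141 FPS in-game limiter
            slack_ms = 1000.0 / cap_hz - 1000.0 / max_refresh_hz
            print(f"capping at {cap_hz} FPS leaves ~{slack_ms:.2f} ms of headroom per frame")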

    • jts888
    • 5 years ago

    This title is slightly confusing IMO.

    Should be something like “40-60 Hz FreeSync vs. traditional V-sync/tearing difference captured in 10x slo-mo”, since the output clip frame rate can’t be pre-assumed.

    I thought for a second that somebody was demonstrating a 240 Hz Adaptive-sync TN panel and got disappointed.

      • xeridea
      • 5 years ago

      I thought that at first too. To be fair, it does say “240 FPS footage”, footage being the keyword. I was initially thinking that if you were really running 240 FPS, would FreeSync even be necessary?

    • EndlessWaves
    • 5 years ago

    So what exactly will the freesync certification consist of and is it worth looking for over the standard adaptive sync feature listing?

      • derFunkenstein
      • 5 years ago

      It reads like FreeSync will only work with whitelisted monitors. I wouldn’t look at other adaptive sync monitors because I’d be concerned that the GPU wouldn’t use the functionality.

        • jts888
        • 5 years ago

        Ryan Shrout from PCPer confirmed with an AMD rep that there will be no whitelisting and that an Adaptive-sync monitor will be supported regardless of FreeSync branding deals:

        http://www.pcper.com/news/Displays/CES-2015-ASUS-MG279Q-27-2560x1440-IPS-120-Hz-Variable-Refresh-Monitor

          • derFunkenstein
          • 5 years ago

          well that’s good, they’re doing the right thing.

      • TheMonkeyKing
      • 5 years ago

      Since the certification is free (assuming the brand sends a model over to the AMD camp for a look-see), my assumption is that all brands will send AMD one that meets the minimum FreeSync requirements. Since they don’t have to build to suit AMD, the certification is just an extra badge to put on the box and in marketing material. First-gen monitors might hike their prices just enough to stay below hardware-mandated G-Sync price points, acknowledging the differentiation from non-FreeSync and non-G-Sync monitors while raising their profits in the short term.

      2nd generation and beyond will be cheaper as GPU hardware becomes better known and more sophisticated. Of course this assumes that 1st gen will be something the end-users identify with needing as opposed to wanting.

      As for the non-certified monitors, people in the know who frequent forums like this or AV geeks, etc. will rely on these folks to test the monitors for Freesync capabilities. These other monitors may end up like the Korean monitors we see on eBay that our group buys instead of shelling out the big bucks for the well known name brands.

        • jts888
        • 5 years ago

        It looks like at least some companies are demoing Adaptive-sync monitors while avoiding explicit AMD/FreeSync partnerships and branding:

        http://www.pcper.com/news/Displays/CES-2015-ASUS-MG279Q-27-2560x1440-IPS-120-Hz-Variable-Refresh-Monitor

        (not mentioned here: http://www.amd.com/en-us/press-releases/Pages/amd-and-technology-2015jan05.aspx)

        This feels like some G-Sync monitor vendors not wanting to piss off Nvidia or devalue the perception of their existing product stock.

          • Ninjitsu
          • 5 years ago

          Or keeping the door open for Nvidia to support Adaptive Sync without having to appear to admit “defeat” to FreeSync.

        • EndlessWaves
        • 5 years ago

        As AMD are dictating the terms of the certification the monitor manufacturers do have to build to suit AMD. If they disagree with AMD’s criteria then it’s entirely possible we’ll see non-FreeSync-certified adaptive sync monitors even with a no cost certification.

        Monitors equipped with DisplayPort are already premium products, so I wouldn’t expect the price to rise too much.

          • the
          • 5 years ago

          Kinda. VGA is finally, finally starting to disappear from monitors. Recent Dell units that I have implement DisplayPort instead of that old analog standard. As a bonus, they also have a built-in MST hub for daisy chaining. HDMI is also found on these units instead of DVI.

          I have the feeling that the resolution race on the high end will eventually force the low end to accept DisplayPort without any sort of premium.

    • MadManOriginal
    • 5 years ago

    An AMD standard that will be widely adopted?? I guess it happens about twice a decade.

    • jthh
    • 5 years ago

    ..and here I am with a R9 280x. Made the wrong choice again. 🙁

      • HisDivineOrder
      • 5 years ago

      Cue the “You just lost” Price is Right music with the horns.

        • Prestige Worldwide
        • 5 years ago

        http://www.sadtuba.com/

      • auxy
      • 5 years ago

      Maybe I’m tired and that’s making me emotional (or maybe it’s hormones, stupid femininity), but the second half of your comment just kills me; brings me to literal tears.

      I know that feeling all too well of having picked the wrong side, having been in the wrong, and it’s the worst feeling in the universe. The very first case; I remember when I was little, cheering on SEGA because my older brother had done so before (in the 16- and 32-bit eras), all the while looking on with a bitter forlornness as the PS2 and Gamecube came out and utterly trashed the Dreamcast. I switched sides later, but that feeling, that awful misery of having to say “All I have is a Dreamcast…”

      Well, “I have a Radeon” sort of evokes that same feeling sometimes.

        • anotherengineer
        • 5 years ago

        It’s OK.

        I think he meant he chose the wrong Radeon, since it is last gen and does not support adaptive/free sync.

        I’m happy with my 6850 radeon 🙂 was only $135, and sold the game it came with for $15, and the 4850 it replaced for $40, going on 2 + years now 🙂

        Edit – hormones can suck too!!

    • Milo Burke
    • 5 years ago

    I can’t wait for a TR article on Freesync! Even if it’s just a short one. But a shootout would be preferable.

    • CampinCarl
    • 5 years ago

    So, someone may have answered this question previously…but what’s the actual difference between FreeSync and Displayport Adaptive Sync? Is FreeSync simply some extra driver stuff from AMD on top of Adaptive Sync, or what?

    Edit: Nevermind, read the FAQ. My intuition was somewhat correct. Adaptive Sync is the hardware spec upon which FreeSync builds its software package.

      • EndlessWaves
      • 5 years ago

      Freesync seems to be like Eyefinity in that it’s AMD’s label for several related things.

      So at the very least it’s both the driver update to support adaptive sync and a certification project certifying… something, about adaptive sync monitors.

    • Meadows
    • 5 years ago

    Well, it works as advertised then. Without FreeSync, the frames visibly update 3 paces at a time with 1 stall afterwards (not to mention the tearing); with it, the motion is constant.

    Now the question is how “Free” this Sync really is, because if it is, then Nvidia is going to have to hurry up with a driver update rather soon.

      • jts888
      • 5 years ago

      The Adaptive-Sync component cost delta should be in the low single-digit dollars, and, surprisingly, announced display prices don’t appear to be gouging customers at all.

      The G-sync FPGA scaler board is probably about $120-$150 in parts alone, but there really haven’t been any substantial logic bitfile updates for it since it came out, except allowing toggling between strobing and variable-sync modes.

      It’s pretty questionable whether Nvidia can add processing logic that’s customer perceptible enough to warrant the substantial added cost.

      • Sargent Duck
      • 5 years ago

        It’s pretty free, as the technology is built into DisplayPort 1.2a. Nvidia can use it as well; they just chose to go after G-Sync instead.

        • jts888
        • 5 years ago

        DP 1.2a wasn’t even dreamed up until after G-Sync was announced; it’s just that Nvidia won’t give up on its sunk costs or allow a feature currently exclusive to AMD GPUs to be seen as better, or at least a better value.

        • Meadows
        • 5 years ago

        They didn’t go anywhere. It’s a matter of a driver update.

          • jts888
          • 5 years ago

          Yeah, but are they going to release an FPGA bitfile to make G-sync monitors emulate $200-less-expensive, AMD-GPU-compatible monitors?

          Nvidia is pretty stuck here and is probably foaming at the mouth that AMD might have pulled this coup off. I just hope OpenGL/DirectX can similarly supersede Mantle so we can get past this spate of unnecessarily proprietary standards.

            • Meadows
            • 5 years ago

            Why should the monitors emulate anything? The monitors are what they are. If someone purchased them, and they work fine (which they should), then all is well. The monitors will not stop working just because an alternative may come, and I’m sure the existing users will get their money’s worth over the years.

            • Klimax
            • 5 years ago

            https://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

            “However, Petersen quickly pointed out an important detail about AMD’s "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

            That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia’s own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia’s intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

            When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn’t think it’s necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That’s why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.”

            This should correct your very wrong understanding of issues.

            • pdegan2814
            • 5 years ago

            The issue won’t be getting G-Sync monitors to be compatible with FreeSync video cards. Nvidia won’t care a bit about that. The question will be whether or not Nvidia feels pressure to add support for AdaptiveSync monitors to their GPUs, and whether or not it can be done with existing GPUs via a driver update, or if it will require hardware changes. At the end of the day, assuming there’s no significant difference in how the two technologies perform, it’s going to come down to price and availability. If there are many more options for AdaptiveSync monitors out there and they’re significantly cheaper than their G-Sync equivalents, Nvidia will be under a lot of pressure.

            • puppetworx
            • 5 years ago

            If the hardware of the GTX 900 series can’t support DP 1.2a (although that functionality may be disabled until a future ‘update’) I’d be very surprised.

            The question for me is when will they pull the update trigger? If it were me, I’d want to pull the rug out from under AMD on the day they launch to maximize coverage: “Surprise folks! We’re supporting this too from day one… and it works across the entire GTX 900 line!” It would steal a lot of attention from AMD and completely remove the need to go with a Radeon.

      • superjawes
      • 5 years ago

      I don’t think this confirms that FreeSync “works as advertised” just yet. This demo isn’t new. We saw it some time ago, and it’s a very static, predictable animation. For FreeSync to really work “as advertised,” it’s going to have to show off in an actual game where the time between frames can fluctuate wildly. We also need to see if there’s any extra overhead and/or lag, and how it compares to G-Sync.

      All that being said, I am glad that we should be seeing real FreeSync monitors in the wild soon, and if it ticks all the boxes without bringing in some extra cost, it will be wildly popular.

        • Meadows
        • 5 years ago

        You can see the pacing of the frames if you watch closely enough. It’s not a simple “vsync: on” setting; the frames come evenly separated in time. That’s enough confirmation for me for now.

      • UnfriendlyFire
      • 5 years ago

      It’s part of the new standard. Not many monitor manufacturers are willing to defy industry port standards for obvious reasons.

      (I remember there was an old port design where the physical dimensions were so loosely defined that a connector was not guaranteed to be physically compatible with a port. I think it was RS-232.)

    • chuckula
    • 5 years ago

    I know a quick video can’t show it all, but how well does that freesync implementation handle frame time jitter from your experience?

    I can already turn on V-sync to stop the tearing, but then you pay the price with jitter/stutter in the frame rate. Did freesync take care of that pretty well?

      • Damage
      • 5 years ago

      Yes. Like G-Sync. Animation looks smooth even at 35-45 FPS.

        • chuckula
        • 5 years ago

        Thanks!

        • nanoflower
        • 5 years ago

        Just so long as the frame rate doesn’t drop below 30FPS.

          • Meadows
          • 5 years ago

          No, 42. Haven’t you been paying attention to news?

            • Pwnstar
            • 5 years ago

            Nano’s point is correct, Adaptive Sync turns off under 30FPS.
