FreeSync monitors hit mass production, coming in Jan-Feb

Among other things, AMD’s Catalyst Omega driver announcement has given us an update of sorts on the status of FreeSync. AMD told us in August that the first FreeSync panels would be out in early 2015, and the company looks set to keep that schedule.

FreeSync displays have entered the mass production and validation stage, AMD tells us. The first commercial offerings are now scheduled for a launch in the January-February time frame. We’ll still have to wait until March for Samsung’s FreeSync-enabled UHD monitors, though.

AMD says that, from the point of view of display makers, there’s “not much you need to do” to support FreeSync other than comply with the DisplayPort 1.2a standard’s Adaptive-Sync specification. Nonetheless, the chipmaker mentioned some “secret sauce” that we’ll learn more about in January or February. Hmm.

I’m guessing we’ll have at least some fresh details early next month, since AMD and its partners plan to have FreeSync monitors on display at CES. Scott will be at the show, so stay tuned.

Comments closed
    • ermo
    • 5 years ago

    As someone who keeps an eye out for tech with potential but leaves the early adoption to others, my only question is:

    I currently have a couple of HD 7970s and 3x 1080p 60 Hz IPS monitors. Can anyone offer some informed speculation, based on industry experience, on how long it will take before someone like me can go out and buy one or two new graphics cards and three new mainstream IPS/VA monitors (i.e. ‘non-gamer/non-10-bit’) where the hardware, the software, and the integration are all considered mature?

    We’re talking at least 12-18 months, aren’t we?

      • ferdinandh
      • 5 years ago

      I would wait for a 40-inch 4K with FreeSync and IPS/VA. Maybe even a 120 Hz/144 Hz version. You go from 3 monitors to one, but you get more screen area and more pixels.

    • cynan
    • 5 years ago

    All well and good for the monitors. Now when will we see televisions with FreeSync? (I suppose we’d need to see them with DP first…)

      • Meadows
      • 5 years ago

      Why do you want a telly with this?

        • meerkt
        • 5 years ago

        1. To use it for games.
        2. To use it for movies, and without ReClock-type shenanigans.

          • superjawes
          • 5 years ago

          Yup. Movies are 24 FPS and most television (IIRC) is 30 FPS. If you want 48 FPS film as well, you have to go all the way up to 240 Hz to get an even multiple of all of those frame rates. Thing is, we don’t NEED fixed refresh rates with LCDs. Let the device refresh the display on the fly and the mismatch goes away.
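
          (A quick back-of-the-envelope sketch of that arithmetic, treating the rates as exact 24/30/48/60 fps rather than the NTSC 23.976/29.97 variants:)

          # Python sketch: lowest fixed refresh rate that all the common source rates divide into evenly
          import math
          rates = [24, 30, 48, 60]        # film, NTSC-style video, HFR film, PC/console
          print(math.lcm(*rates))         # 240 -- 10x, 8x, 5x, and 4x repeats respectively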

          Oh, and games…just because I don’t play on consoles (often) doesn’t mean that I don’t want my console friends to enjoy better animation quality.

          And:
          3. So we can get adaptive refreshing when connecting a TV to our PCs 😉

        • Farting Bob
        • 5 years ago

        24 fps (and future 48 fps) movies don’t match up with 60 Hz TVs perfectly. FreeSync could fix that.

          • Palek
          • 5 years ago

          I’m fairly certain many TVs adaptively change their refresh rates already to match the video source, e.g. they use 48Hz for 24Hz sources, which is a much better match. The panels used in TVs typically have a range of frequencies they are happy to work with, such as 48 ~ 60Hz. Some Sony TVs for example happily switch between 50Hz for PAL sources, 60Hz for NTSC, and 48Hz for 24Hz sources.

            • JustAnEngineer
            • 5 years ago

            My Bravia XBR-55HX929 does this by running at 240 Hz (which Sony calls “Motionflow XR960”) to avoid ugly telecine judder problems.

            • meerkt
            • 5 years ago

            Are you sure it’s 240 Hz? I don’t think panels go higher than 120 Hz (or 144 Hz, at least in computer monitors). You can find the panel model number in the service menu/mode.

            • Prestige Worldwide
            • 5 years ago

            It’s just interpolation. Per the HDMI spec, most TVs don’t accept an input higher than 60 Hz; between every two real frames received in the input signal, the scaler inserts three extra interpolated frames to artificially “smooth” the animation.

            • meerkt
            • 5 years ago

            Yes, but it’s not accurate, at least not for PC sources. My TV’s 23.976 Hz mode is in fact 23.972 Hz, so there’s a one-frame mismatch something like every few minutes.
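
            For scale, a rough sketch of that drift, taking the 23.972 Hz figure above at face value:

            # Python sketch: time until the source and display drift apart by one full frame
            source_fps, display_hz = 23.976, 23.972
            seconds_per_frame_of_drift = 1 / (source_fps - display_hz)
            print(seconds_per_frame_of_drift / 60)   # ~4.2 minutes per dropped or repeated frame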

      • SpoCk0nd0pe
      • 5 years ago

      I sincerely hope for this too. Since I’m sitting 2.5 meters away from my monitor, I’d really appreciate a 1080p 40″ FreeSync solution. Preferably with low persistence and decent color accuracy/black levels at a price tag below 500€.

    • kilkennycat
    • 5 years ago

    Prototype implementations of G-Sync were available for third-party evaluation months before the production models.

    Scott, when are you going to have a FreeSync prototype monitor (and compatible AMD GPU hardware) for side-by-side evaluation with G-sync? “FreeSync panels would be out in early 2015” implies present availability of beta-phase prototype hardware.

    AMD (Marketing) always seems to be great in “talking the talk”, far less so in “walking the walk”.

    • Firestarter
    • 5 years ago

    I just want to see how it compares to G-Sync

    • anotherengineer
    • 5 years ago

    It would be really nice if a monitor firmware upgrade could take an existing DP 1.2 monitor up to DP 1.2a or 1.3 so it could support this.

    • The Egg
    • 5 years ago

    A free solution would be best for all, but I remain cautiously optimistic. I have to wonder why NVidia would design an expensive logic board with a dedicated processor and RAM if it could be done just as well for free. Sure they’d love to corner the market with their own proprietary solution, but if that were their only intention, you would expect them to go with a much simpler design. They could easily achieve the same goal with a $10 proprietary card, and move substantially more G-Sync monitors, thus bringing them closer to their goal.

    So why didn’t they?

      • superjawes
      • 5 years ago

      Again, this G-Sync solution is still using an FPGA, which means that Nvidia can still change the design if they need to. That’s where most of the cost lies. Once the design is complete, the per-unit cost WILL come down (at least from a BOM perspective).

      But that being said, I’ve been wondering the same thing. It does seem awfully strange that anyone would implement their own solution if the controls already existed within the current spec. I don’t think Nvidia are a bunch of fools, which means that there is a reason for this particular solution, and I’d like to find out what that is…

        • The Egg
        • 5 years ago

        Even after they go to ASIC, it’s still a relatively expensive solution, with chip design and fab costs, plus around 768MB DDR3 on each board. If proprietary control was their only intention, they could’ve gone with something much less expensive.

          • Airmantharp
          • 5 years ago

          We don’t actually know if it’ll be any more expensive, because we don’t know if that RAM is actually needed for G-Sync operation or just for the FPGA to do its thing.

          • psuedonymous
          • 5 years ago

          The big RAM bank is for bandwidth, not capacity. The FPGA used doesn’t have a proper memory controller and can’t implement one (at least not without using an even gruntier FPGA). So a lot of chips are run in parallel at well below their rated speed, because that’s cheaper than the bigger FPGA that would be needed to implement a proper memory controller.

          The actual work that needs to be done by the display controller after receiving the frame (i.e. everything after the DVI/DP signal is received and decoded to a raw pixel stream) is effectively identical between G-Sync and DP Adaptive Sync.

      • tuxroller
      • 5 years ago

      Lock-in, imho. PhysX and CUDA come to mind.

      • Prestige Worldwide
      • 5 years ago

      I think we will see that GSync is the better technology in the end. FreeSync is just another “Me too” reaction from AMD after Nvidia introduced an innovative feature outside of GPU hardware.

      Other recent examples would include NV ShadowPlay / AMD Gaming DVR, Dynamic Super Resolution / Virtual Super Resolution, and frame pacing embedded in Nvidia hardware since the G80 versus AMD’s software frame-pacing fix for Tahiti and, finally, hardware frame pacing in Hawaii.

      Nvidia is consistently and proactively improving the overall experience for their customers and adding useful new features to their hardware and software; AMD is often just reactively copying them.

      As to why they didn’t: well, in the end a hardware solution will probably achieve the best results, and AMD was caught with their pants down again and scrambled to find an easy alternative to implement without as much R&D.

      This is just my speculative take, I look forward to Tech Report doing a thorough comparative analysis.

      I currently have a BenQ XL2420T 120 Hz TN, which has been great for multiplayer gaming. Sure, the colours and viewing angles are not the best, but when your objective is performing to the highest level you can online, those things become secondary to gameplay.

      However, I would REALLY like to eventually have a 27″ IPS 1440p 144 Hz G-Sync or FreeSync monitor, whichever ends up being the better technology. To game with the best fluidity and colour without compromise would be awesome.

    • aspect
    • 5 years ago

    Any plans to test how much of a latency increase FreeSync introduces?

      • Terra_Nocuus
      • 5 years ago

      I’m betting there will be in-depth analysis once units are available. I’m hopeful for DisplayPort Adaptive Sync / FreeSync, but I won’t be terribly surprised if G-Sync works a little bit better.

        • superjawes
        • 5 years ago

        There’s also going to be some weirdness in that comparison, as all G-Sync monitors are 144 Hz (at least so far), and I think these Adaptive Sync ones will be 60 Hz. On the other hand, as long as FreeSync’s overhead is minimal to non-existent (and assuming the animation is still improved), it will basically win this “format war”.

        Nvidia is pushing the tech to the limit, which might result in a better product overall, but having the tech work in more common displays could mean that the tech comes to IPS sooner rather than later, and I suspect that gamers would prefer a 60 Hz IPS FreeSync display to a 144 Hz TN G-Sync one.

          • Terra_Nocuus
          • 5 years ago

          I think G-Sync can run in 60Hz mode (IIRC), so it might be tangerines-to-oranges.

          • A_Pickle
          • 5 years ago

          As a gamer, I will prefer whichever is cheaper — which is almost always the open, industry-standard rather than proprietary option.

          • tuxroller
          • 5 years ago

          I’m not sure that this is true.
          IIRC, FreeSync covers a larger range of refresh rates than G-Sync, according to AMD’s FAQ.
          For 60 Hz monitors, Adaptive-Sync allows the refresh rate to go as low as 7 Hz, IIRC. It also supports video refresh modes. AIUI, G-Sync is only concerned with gaming.

          Other than that, I’d expect them to perform largely similarly.

            • superjawes
            • 5 years ago

            Well that actually makes them diverge a bit. I’m only assuming 60 Hz because I really haven’t seen any indication on what these monitors will be like. If they had the “gaming” keyword, I would assume 120-144 Hz TN displays.

            But the thing is, G-Sync is being made to push the bar upwards, which is why we haven’t seen any non-144 Hz G-Sync displays (unless they were 4k), and G-Sync does not handle <30 FPS rates elegantly ([url=https://techreport.com/news/27449/g-sync-monitors-flicker-in-some-games-and-here-why<]see here[/url<]). If FreeSync works below 30FPS, then it will definitely have an edge on the lower end of frame rates. The head-to-head comparison will definitely be useful. I just think that it will be interesting, and certain aspects aren't going to line up perfectly.

            • tuxroller
            • 5 years ago

            I’m not really sure what you’re trying to say.
            High-refresh-rate monitors existed before G-Sync.
            IIRC (since the page is down, I can’t reference AMD at the moment, and VESA charges for copies of their standards), the supported refresh rates run from 7 to 240 Hz. That’s not a single range but what seems to be provided by the spec. It APPEARS that any given monitor is limited to a relatively small set of frequencies it can support (an array of 10 or so, by my guess). So the monitors that can go down to 7 Hz won’t be going up to 144 Hz.

          • Laykun
          • 5 years ago

          FreeSync seems aimed primarily at gamers, so I don’t see why we wouldn’t see Adaptive-Sync monitors with 144 Hz panels. At least I haven’t seen any numbers to suggest there would be any kind of technical limitation around the refresh rate.

            • superjawes
            • 5 years ago

            Technical limitations are something I suspect. As I’ve said (somewhere in this discussion), Nvidia are not a bunch of fools, so there was definitely a reason why they implemented something custom as opposed to using something that would have been much quicker and easier to implement (via the DisplayPort Adaptive-Sync spec). The problem is that I don’t know what that reason is. It could just be to bring other Nvidia technologies to displays (like LightBoost). On the other hand, it could also be Nvidia’s way of overcoming technical limitations inherent in the spec.

            It’s a major reason why these Adaptive Sync monitors need to hit shelves ASAP so we can actually get the head-to-head comparison as opposed to speculation which mostly comes from AMD marketing.

            • Laykun
            • 5 years ago

            I almost feel like G-Sync exists to give the industry a kick in the pants so they get their act together on a standard like this, much like AMD’s Mantle has challenged conventional graphics APIs. I doubt Nvidia will want to continually invest in what will essentially become a dead-end product (assuming the open standard and its implementation are at parity with or better than G-Sync). I think if Nvidia were serious about G-Sync they’d have subsidised its price or moved to custom silicon instead of an FPGA to make it a more attractive option to video card buyers and lock them into the Nvidia ecosystem.

          • _ppi
          • 5 years ago

          Having G-Sync promoted on 144 Hz panels is weird anyway. Not that it’s useless there, but the impact on a 60 Hz panel is much greater than on a 144 Hz one.

          Case example:
          60 Hz panel: if frames are rendered at 59 fps (convert to frame times in ms, if you wish), then with VSync enabled the effective rate falls all the way down to 30 fps.
          120 Hz panel: if frames are rendered at 59 fps, the effective rate is 40 fps, i.e. a third faster than the 60 Hz panel (see the quick sketch of the math below).
          Adaptive-sync 60 Hz panel: that same 59 fps stream would be displayed at 59 fps.
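
          A rough sketch of where those numbers come from, assuming plain double-buffered vsync and a constant 59 fps (~16.9 ms) render time; triple buffering and real-world frame-time variance will behave differently:

          # Python sketch: effective frame rate with double-buffered vsync on a fixed-refresh panel
          import math

          def effective_fps(render_ms, refresh_hz):
              interval_ms = 1000 / refresh_hz                    # time between refreshes
              refreshes_per_frame = math.ceil(render_ms / interval_ms)
              return 1000 / (refreshes_per_frame * interval_ms)

          print(effective_fps(1000 / 59, 60))    # 30.0 -- each frame waits for every other refresh
          print(effective_fps(1000 / 59, 120))   # 40.0 -- each frame ships on every third refresh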

          And to add insult to injury, the adaptive-sync techs help most at LOW frame rates, i.e. when you are in the 20s or 30s. I know some people argue to the death that 120 Hz is waaay better than 60 Hz, but the most substantial benefit is for those of us who are happy to consistently get above 30 fps with reasonable details.

          Same story as with Mantle: it helps most with slow, multi-core CPUs (i.e. what AMD produces by “chance”).

            • Voldenuit
            • 5 years ago

            144 Hz is great for movies.

            23.976 fps * 6 = 143.86 fps.

            • Firestarter
            • 5 years ago

            Adaptive sync helps most at low frame rates/refresh rates, but tearing still exists at 100+ fps/Hz, even if it’s not as bad as with slower displays. That said, with low fps but a fast display, tearing and stuttering are pretty much a non-issue

    • Hattig
    • 5 years ago

    Over time, the additional cost of providing Adaptive Sync / FreeSync in a monitor will be zero. It is in fact an enhancement of a feature already used in eDP for power saving, where the refresh rate can be dropped.

    Nvidia’s solution, besides being proprietary, currently adds $100 to the cost of a monitor. Nvidia is using G-Sync as a way to gain market share in the LCD controller market.

    If Nvidia do say “G-Sync in == Adaptive Sync out”, then that’s nothing to celebrate. But no matter, I won’t be buying those monitors anyway.

      • superjawes
      • 5 years ago

      True, the cost of adding Adaptive Sync/FreeSync to monitors will be zero, at least from a BOM perspective. But don’t act surprised when monitor manufacturers tack on a premium for such displays.

      I also want to point out that the BOM cost of G-Sync displays will also approach zero. That $100 addition comes from an FPGA, which runs for about $100. Once that gets replaced with an ASIC, the per-unit cost basically melts away. I do expect premiums to be added to sale prices, but this BOM price reduction is nothing new.

        • willmore
        • 5 years ago

        The G-Sync licensing fee doesn’t melt away, does it?

          • superjawes
          • 5 years ago

          1. Do you have any idea what that fee will look like? Any source to indicate what it is?

          2. It would be unnecessary. As long as G-Sync only works with Nvidia GPUs, Nvidia collects its “licensing fees” by selling more GPUs.

            • willmore
            • 5 years ago

            No, I don’t have any idea. Should I? G-Sync is a proprietary and licensed scheme that nVidia controls. If there is no monetary fee, then I am curious to know what restrictions or limitations are part of getting a license from nVidia.

            Your second point is speculation.

            • superjawes
            • 5 years ago

            My second point is fact. G-Sync does not work with AMD GPUs.

            And also, yes, you should have an idea of what the licensing scheme is for G-Sync if you are going to make assertions about it. To date, the technology does NOT look like a licensed product. The scaler logic is being put on a module made by Nvidia for each individual display. That isn’t a licensing agreement; it’s use of Nvidia parts for the implementation.

          • Terra_Nocuus
          • 5 years ago

          No, but your screen tearing problems will 🙂

    • Tristan
    • 5 years ago

    It doesn’t matter what AMD says. They said: validation in October, first samples in November, first sales in December. Now their plans are delayed by a quarter. And who really wants to manufacture these displays? For now, Samsung, because they are in a patent war with NV. Anybody else, like Dell, Asus, or BenQ? Total silence on FreeSync. Instead they implement G-Sync and charge money on sales. NV really likes exclusivity, and there may be deals like ‘G-Sync in = FreeSync out’.

    • Krogoth
    • 5 years ago

    It will take until Q3-Q4 2015 for monitors to start coming with FreeSync by default.

    • odizzido
    • 5 years ago

    This is great. I just hope nvidia supports it.

      • dragontamer5788
      • 5 years ago

      I’ve said it before, and I’ll say it again.

      FreeSync is not a top-down implementation controlled by a single interested party. FreeSync is a standard… a standard that will be implemented differently by several different companies at varying quality levels.

      My [b<]primary[/b<] hope is that someone will implement FreeSync correctly near its launch. Or, failing that, that poorly implemented FreeSync monitors won't "poison the well" for the general community.

        • riflen
        • 5 years ago

        Well then you’ve been incorrect before and you’re incorrect again. FreeSync is an AMD exclusive feature that requires AMD GPUs and specific monitor hardware. FreeSync is not a standard. It’s a vendor product feature based on an optional feature of the DisplayPort standard. It’s all here in the [url=http://support.amd.com/en-us/kb-articles/Pages/freesync-faq.aspx<]FreeSync FAQ[/url<]. There's so much misinformation on this topic that it becomes almost impossible to have an accurate discussion on the subject online. When you say FreeSync in your post, you mean DisplayPort Adaptive-Sync.

          • superjawes
          • 5 years ago

          I’m not sure AMD is intentionally doing it, but they are certainly confusing the terms “Adaptive Sync” and “FreeSync”.

            • Pwnstar
            • 5 years ago

            Only for dumb people.

            • Ninjitsu
            • 5 years ago

            Dumb people are confused anyway.

          • dragontamer5788
          • 5 years ago

          [quote<]FreeSync is an AMD exclusive feature that requires AMD GPUs and specific monitor hardware.[/quote<] And that specific monitor hardware is being implemented independently by three different scaler companies (MStar, Novatek and Realtek), whose scalers will in turn be used by countless monitor companies. My point stands. The quality of FreeSync monitors is going to vary dramatically. AMD cannot control what monitor makers are going to do. Nvidia's G-Sync solution can, but at a cost to the consumer. It's the nature of how these things are set up.

            • Pwnstar
            • 5 years ago

            That monitor hardware is for Adaptive Sync. It is GPU agnostic. It becomes FreeSync if you attach it to an AMD GPU.

        • WhatMeWorry
        • 5 years ago

        You can say that about all Standards.

          • dragontamer5788
          • 5 years ago

          Yes… I can. I thought I was stating the obvious. Maybe it’s too obvious?

          Just don’t go out and buy the first FreeSync monitor. Hold back, check the reviews, do your homework. I’ve got an R9 290X and am hugely supportive of AMD’s FreeSync efforts. I’m eagerly awaiting a FreeSync monitor to replace my ~5-year-old monitor.

          But I know how standards work, I know how the industry messes things up. I’m not going to be first in line… I’m going to wait for the reviews to come in and tell me which monitors are bad with FreeSync, and which ones are good.

      • HisDivineOrder
      • 5 years ago

      I expect since it’s part of the standard of DisplayPort 1.3, they will. Just not ASAP. When they finally get around to DP 1.3, I expect the support will be there along with the rest of the standard. At the same time, they’ll claim that G-Sync and Adaptive-Sync (no relation to Adaptive V-Sync even though the DisplayPort committee did the world no favors when it made the name so close to the much older nVidia tech name) can coexist because G-Sync is superior in [insert marketing jargon] ways.

      nVidia has shown itself not above supporting standards when they are actually presented to standards bodies. They just don’t support so-called standards that AMD calls standards without actually going to the trouble of making them into actual standards.

      I’m thinking of Mantle in this regard along with their PhysX alternative(s) and their original take on a proprietary CUDA alternative.

      In fact, given AMD’s focus on Mantle, nVidia has refreshingly stayed the course with DirectX and OpenGL. You know? The standards the industry has long relied on.

      Imagine the chaos if nVidia had rechristened Glide. Open standards are better than closed standards and standards in general are better than proprietary nonsense called a standard. And everything nVidia does seems to back that up, from the limited success of PhysX to their rousing success with DirectX optimizations that mostly match the lone advantage (ie., supposedly performance) of the failure that is Mantle.

      So yeah. Since AMD got FreeSync into DisplayPort 1.3 under another name (not coincidentally, one very much like the name of an older nVidia technology, probably a jab), I expect nVidia to support the whole DP standard in due course.

      Don’t expect them to be in a rush to do so. I expect the bigger compatibility nightmare will be all the G-Sync monitors currently in production probably not supporting DP 1.2. Hell, I imagine quite a few DP 1.3 monitors won’t support Freesync…er… I mean, Adaptive-Sync either.

        • dragontamer5788
        • 5 years ago

        [quote<]nVidia has shown itself not above supporting standards when they are actually presented to standards bodies.[/quote<]

        Nvidia is almost 4 years late on supporting OpenCL 1.2, and doesn't seem to have any plans on supporting OpenCL 2.0 at all. In contrast, AMD currently has support for OpenCL 2.0.

        All companies support the standard when they are the "underdog". Nook, Kobo, and Google support EPUB; Amazon keeps Kindle proprietary due to its market-share advantage. Microsoft only cared about HTML5 and web standards once it lost its IE monopoly.

        Similarly, if Nvidia perceives G-Sync to be an advantage, they will hold onto it with a death-grip, much like how Nvidia refuses to support any OpenCL feature newer than 2011. The only reason a "superior" company would want to embrace a technology is to destroy it from within. [url=http://en.wikipedia.org/wiki/Embrace,_extend_and_extinguish<]Embrace, Extend, Extinguish[/url<] is the oldest trick in the book for tech companies. If anyone wants to write cross-platform OpenCL code, they are forced onto three-year-old APIs at best because of Nvidia's strategy.

        We can probably expect Nvidia to also throw a monkey wrench into Adaptive Sync. That's the problem with an open standard: no one really has to support it 100%... and Nvidia could control the propaganda wars by making a shoddy implementation if they wished. Or they could layer G-Sync-specific features on top of the DP standard, making monitors proprietary anyway.

        Basically, it's a format war. DVD+R vs DVD-R, Betamax vs VHS, Blu-ray vs HD DVD.

        • psuedonymous
        • 5 years ago

        [quote<]I expect since it's part of the standard of DisplayPort 1.3[/quote<]It's an [i<]optional[/i<] portion of the standard (or rather, DP Adaptive-Sync is, with FreeSync being AMD's source-end branded implementation), so nobody is under any obligation to support it.

        The big problem with FreeSync is that AMD offers no reference implementation on the monitor side. G-Sync has a validated FPGA design that display controller manufacturers can license and implement in their ASICs. For FreeSync, they either need to buy G-Sync anyway to get the functional blocks needed to do high-speed asynchronous panel refresh and self-refresh, or design their own from scratch.

          • dragontamer5788
          • 5 years ago

          That's why AMD announced [url=http://www.amd.com/en-us/press-releases/Pages/support-for-freesync-2014sep18.aspx<]this[/url<]. It looks like the major scaler makers have implemented Adaptive-Sync in their ASICs.

      • Bensam123
      • 5 years ago

      Unlike G-Sync for AMD, Nvidia actually has the option to support this…

    • DrCR
    • 5 years ago

    AMD have a leg up on Nvidia when it comes to such ‘shuddering’ issues to begin with — They make use of a catalyst to lower the energy of activation.
