FreeSync 2 simplifies life with brighter, more colorful displays

The open nature and flexible implementation specifications of AMD's FreeSync variable-refresh-rate technology have doubtless contributed to its wide adoption by monitor manufacturers. At its summit in Sonoma, California, last month, AMD boasted that it now has 121 FreeSync displays from 20 display partners, and it's quick to press that numeric advantage over the eight partners and 18 displays it says Nvidia has netted with G-Sync.

Not every one of those 121 FreeSync displays is a gamer's dream, though. Manufacturers often don't clearly communicate refresh-rate ranges, and confirming support for desirable features like Low Framerate Compensation can still require a deep dive into AMD's own FreeSync monitor list.

Figuring out exactly what one is getting from a given monitor is about to become even more difficult, too. We're already seeing displays with a wide range of specs that claim HDR support, and unless you're well-versed in video, professional graphics, or photography, it can be hard to keep track of the many color-gamut standards displays can reproduce. As HDR support and wide color gamuts begin to make their way into games, it'll only become trickier to tick all the boxes needed to faithfully reproduce that content.

Today, AMD is trying to make life with those increasingly bright and colorful displays easier. Its FreeSync 2 initiative, debuting this morning, addresses several of the issues with displaying the "deep pixels" that the company has been evangelizing for some time. Just like FreeSync does for refresh rates, FreeSync 2-compatible displays will communicate critical information about their color-reproduction capabilities and dynamic range to the graphics driver.

AMD says the major benefit of this information-sharing is reduced input lag, since the graphics card and game engine can target the particular color space and tone-mapping of a given display directly. Without FreeSync 2, additional tone-mapping steps would have to be performed in the display chain to match the output characteristics of a given screen, increasing lag.
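For a sense of what "targeting the display directly" could mean in practice, here is a minimal sketch, assuming a generic Reinhard-style operator and made-up peak-luminance numbers rather than anything AMD has published: once the engine knows the panel's peak brightness, it can tone-map straight to that range in a single pass.

```python
# A minimal sketch, not AMD's actual pipeline: the kind of single-pass tone map a game
# engine could do once it knows the display's peak brightness, instead of leaving a
# second tone-mapping pass to the monitor. The 1000-nit scene peak and 600-nit display
# peak are made-up example values.

def tone_map_to_display(pixel_nits, scene_peak_nits=1000.0, display_peak_nits=600.0):
    """Map scene luminance (nits) onto the display's reported peak in one pass."""
    l = pixel_nits / display_peak_nits
    l_white = scene_peak_nits / display_peak_nits
    # Extended Reinhard curve: the scene peak lands exactly on the display peak,
    # while dark and midtone values pass through nearly unchanged.
    mapped = l * (1.0 + l / (l_white * l_white)) / (1.0 + l)
    return mapped * display_peak_nits

print(tone_map_to_display(100.0))   # midtones: ~91 nits, barely compressed
print(tone_map_to_display(1000.0))  # scene peak maps to the display's 600 nits
```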

That communication also lets the graphics card automatically switch between color spaces and dynamic ranges as needed. For example, moving from an sRGB desktop to an HDR10 movie or game and back should happen seamlessly.
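To illustrate why that hand-off matters, the sketch below contrasts the standard sRGB encode used on the desktop with the SMPTE ST 2084 (PQ) encode that HDR10 signaling uses. The formulas are the published standards, not anything FreeSync 2-specific.

```python
# Hedged illustration of the switch above: an sRGB desktop and an HDR10 signal use
# completely different transfer functions, so the driver has to know which one the
# display expects at any given moment.

def srgb_encode(linear):
    """Standard sRGB OETF for a linear-light value in 0..1."""
    return 12.92 * linear if linear <= 0.0031308 else 1.055 * linear ** (1 / 2.4) - 0.055

def pq_encode(nits):
    """SMPTE ST 2084 (PQ) encode of absolute luminance (0..10000 nits) to 0..1."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = max(0.0, min(nits / 10000.0, 1.0)) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(srgb_encode(0.5))   # ~0.735: the desktop's gamma-like curve
print(pq_encode(100.0))   # ~0.51: 100 nits sits near mid-signal in PQ
```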

To meet those high standards for ease of use, dynamic range, color reproduction, and input lag, AMD says FreeSync 2 monitors will have to support HDR and wide color gamuts by default, and they'll have to undergo a strict certification process. FreeSync 2 displays will also be required to have Low Framerate Compensation capability. Because of those stricter implementation requirements, FreeSync 2 won't replace the original FreeSync.

Once FreeSync 2 displays begin arriving, FreeSync-compatible Radeon hardware that's already capable of working with wide color gamuts and HDR content should hook right up to them with the proper drivers installed.

AMD didn't announce any FreeSync 2 displays at its summit last month, but we imagine we'll see or hear of at least one compatible display on the CES show floor. Stay tuned as we keep an eye out for these appealing-sounding displays during our trek through Vegas this week.

Comments closed
    • Andrew Lauritzen
    • 3 years ago

    This is potentially cool, but there are *many* reasons why calling this “FreeSync 2” is a terrible idea…

      • Voldenuit
      • 3 years ago

      [quote]This is potentially cool, but there are *many* reasons why calling this "FreeSync 2" is a terrible idea...[/quote]

      Decent 2: Freesync. Also, Freesync 2: The Syncening.

      • LostCat
      • 3 years ago

      While it does sound terrible, it is still a set of features on top of FreeSync…so…meh. I don’t know what else they could call it.

    • Chrispy_
    • 3 years ago

    I have [i]soooo[/i] many doubts about HDR and wide gamuts. It's 2017 and there are some serious issues just getting the 20-year-old LCD preferred default of 1:1 pixel-mapped 8-bit RGB working correctly:

    - Nvidia drivers do it wrong by default on HDMI connections, using limited 16-235 colour ranges.
    - Overscan (for VHS cassette deck compatibility) is still a common irritation on many devices with either AMD or NVidia hardware.
    - The most basic of all colourspace standards (sRGB) is still rarely met by displays, even when calibrated (which few people ever bother with).
    - Most displays come with stupid image enhancements with stupid marketing names like "britecolor" and "hd trueview" that completely ruin the picture accuracy by oversaturating colours, ruining gamma curves, crushing the blacks, the whites, or both, and (if it's a TV rather than a monitor) probably providing some measure of "edge enhancement" and "noise reduction" and most likely dynamic contrast as well, just to add salt into the wound.

    When I can pick any monitor on sale at random and have a >50% chance of getting a 1:1 pixel-mapped, 8-bits-per-channel RGB output, then I'll consider HDR as a viable next-level goal, but the market as a whole is still a long way from this at the moment 🙁

    In the meantime, HDR is just one more fustercluck of confusing marketing spiel for manufacturers and vendors to add to the list of things they can screw up as badly as the other basics they repeatedly fail to get right.
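As a rough illustration of the 16-235 issue raised above, this sketch shows the standard 8-bit level expansion that has to happen somewhere when a limited-range signal meets a full-range display; it's generic arithmetic, not any vendor's driver code.

```python
# If one end of an HDMI link sends "limited range" video levels while the other end
# expects full range, blacks get lifted and whites get dulled unless something rescales.
# This is the standard 8-bit expansion, shown only to make the complaint above concrete.

def limited_to_full(value):
    """Expand a limited-range (16-235) 8-bit value to full range (0-255)."""
    expanded = (value - 16) * 255.0 / (235 - 16)
    return int(max(0, min(255, round(expanded))))

print(limited_to_full(16))   # -> 0   (video black becomes true black)
print(limited_to_full(235))  # -> 255 (video white becomes full white)
print(limited_to_full(128))  # -> 130 (midtones shift slightly)
```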

      • Voldenuit
      • 3 years ago

      [quote]When I can pick any monitor on sale at random and have a >50% chance of getting a 1:1 pixel-mapped, 8-bits-per-channel RGB output, then I'll consider HDR as a viable next-level goal, but the market as a whole is still a long way from this at the moment 🙁[/quote]

      Welp, there's your problem. If you want accurate color for a photography or video workflow, invest in a colorimeter (really, dude, it's $99 on Amazon) or a $2,000 professional monitor. And spend 5 minutes researching any product that is critical to your needs instead of buying, as you say, at random.

      99% of consumers prefer a saturated, contrasty image instead of a flat profile or accurate colors. And since their goal is typically media consumption, that's a fully justifiable viewpoint. If they were publishing National Geographic or TIME magazine from their basement, it'd be a different matter.

        • Chrispy_
        • 3 years ago

        You're confusing colour accuracy with the size of the colour gamut. Sure, colour accuracy is nice to have – as the cherry on the cake – if you can get it without compromising the gamut, but the accuracy and the size of the gamut are entirely different things.

        I'm sure you're familiar with [url=https://d1w5usc88actyi.cloudfront.net/wp-content/uploads/2013/02/ColorRange.jpg]this[/url] [u][b]representation[/b][/u] of colour gamuts. The smallest, most pathetic triangle (sRGB) is what consumer monitors aim for, but most of them fail miserably to achieve. [i]Good[/i] VA and IPS monitors can usually just about cover 100% of this gamut but can't exceed it without a better backlight; whether that's quantum dots, cold cathodes, or even RB+G LEDs, it takes a wider-frequency backlight than WLEDs to give you extended gamut - or in other words, [b]more vibrancy[/b] beyond the sRGB colourspace.

        Since I can clarify that, I will: a calibrated, narrow-gamut screen is exactly what delivers this "flat profile" you say consumers don't like. Wider gamut = redder reds, greener greens, and bluer blues, so a wide-gamut screen has the potential to deliver more vibrant images [i]and[/i] be accurate at the same time.

        Where it all falls down with this fallacy of a "flat profile" is that people prefer "saturated, contrasty images". This [b]DOES NOT[/b] mean that people like contrast and colour-space saturation (clipping of their colour space). What it means is that they want to experience the vibrant colours of the full gamut of human vision (represented by the outer curved bell shape in that image I linked above). By boosting the saturation and contrast of an image on an sRGB screen, you are effectively stretching the gamut wider until the point at which it clips either the contrast or the colour. On a wide-gamut, AdobeRGB-capable screen you have to boost the saturation and contrast less to get the same results, therefore you can get higher accuracy (since the boost to colour saturation is less obvious) and you can cover more of your eye's visible gamut before it clips at the edge of the colour space where the screen runs out of vibrancy. It can't display a green any greener than the green of its quantum-dot phosphor, or the greenness of the combined backlight and green filter in each green subpixel.

        - Oversaturating a low-gamut screen is a kludge to improve the vibrancy the screen lacks, but it goes wrong with colour/contrast clipping once the source needs a colour more vibrant than the low-gamut screen can provide.
        - HDR tone-mapping and dynamic colour spaces are kludges to reduce the horrible obviousness of colour clipping on a saturation-boosted low-gamut screen.

        [b]TL;DR[/b] So, firstly, most manufacturers can't be bothered to even make wider-gamut screens. We're still seeing WLED TN screens with less than 75% of the sRGB colour space, and 99% of the VA and IPS models are still WLED, so they only get to about 98% of it. Colour accuracy goes to hell as a result of the saturation-boost kludge, and then HDR is a kludge for the kludge. Expecting [b]EVERYONE[/b] in the chain to get this new HDR kludge right is a lot to ask, probably [i]too much to ask[/i] given that lots of those people in the chain usually screw up the far simpler 0-255 RGB stuff we've been using for two decades. The artist, the developer, the screen manufacturer, Nvidia/AMD/Intel, and of course your OS vendor all have the independent potential to screw up the whole chain, and it actually requires everyone to get it right for any of it to work!

        Can't we just have less awful screens please?
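To put a number on the "size of the gamut" point above, here is a small sketch comparing the areas of the sRGB and Adobe RGB triangles in CIE 1931 xy space using their published primaries; xy area is only a crude proxy for perceived gamut, but it makes the comparison concrete.

```python
# Compare gamut triangle areas in CIE 1931 xy chromaticity space. The primaries are the
# standard published values; the area ratio is an illustration, not a colorimetric metric.

def triangle_area(r, g, b):
    """Shoelace formula for the area of a triangle given three (x, y) vertices."""
    (x1, y1), (x2, y2), (x3, y3) = r, g, b
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

srgb_primaries  = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]  # R, G, B chromaticities
adobe_primaries = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]  # greener green primary

ratio = triangle_area(*adobe_primaries) / triangle_area(*srgb_primaries)
print(f"Adobe RGB covers ~{ratio:.2f}x the xy area of sRGB")  # ~1.35x
```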

          • Voldenuit
          • 3 years ago

          [quote]Good VA and IPS monitors can usually just about cover 100% of this gamut but can't exceed it without a better backlight; whether that's quantum dots, cold cathodes, or even RB+G LEDs, it takes a wider-frequency backlight than WLEDs to give you extended gamut - or in other words more vibrancy beyond the sRGB colourspace. ... Can't we just have less awful screens please?[/quote]

          I think you answered your own question here. Increasing the color gamut requires more advanced hardware than the mass market is willing to bear currently, and has diminishing returns for the consumer. As more and more media is created for the various HDR standards out there, more demand will eventually lead to more products at better prices. There are already professional monitors that are designed for and around wide color spaces, but they're not cheap.

          It's not surprising that TVs are leading the way in terms of wider-gamut displays (though let's be honest, their color accuracy is typically worse than computer monitors'), because they are a much larger market with more competition and direct interests in promoting the standards (Dolby HDR, Hi-10, etc.). With media companies actively working to sabotage UHD and HDR content on the PC (because they want their walled gardens), I predict consumer uptake of UHD+HDR monitors on the PC to be a bit slower.

      • jihadjoe
      • 3 years ago

      [quote]Nvidia drivers do it wrong by default on HDMI connections, using limited 16-235 colour ranges.[/quote]

      Is this still true? I have an HDMI-connected Dell Ultrasharp and it set itself to 0-255 without me doing anything.

        • Chrispy_
        • 3 years ago

        It's a two-way thing. Depends on the monitor/TV, but if the monitor *doesn't* specifically state that it's expecting 0-255, Nvidia defaults to 16-235.

          • jessterman21
          • 3 years ago

          yep, every time I update drivers on my HTPC I have to reset it to 0-255

    • kloreep
    • 3 years ago

    Has AMD said if Freesync 2 will also be submitted for adoption as open standard(s) like Freesync was released as Adaptive Sync? Or is it going to be a turn toward the proprietary?

      • DPete27
      • 3 years ago

      Think of FreeSync 2 as equivalent of FreeSync, but with minimum requirements. As stated in the article, both classifications will continue to co-exist.

    • stefem
    • 3 years ago

    [quote]they'll have to undergo a strict certification process.[/quote]

    Hope this will improve the average quality of the panels and the overdrive applied.

    • DragonDaddyBear
    • 3 years ago

    I liked how AMD approached Vulkan (“donated” Mantle) and FreeSync (which is more or less VESA Adaptive Sync). I hope they continue the trend.

      • Wild Thing
      • 3 years ago

      That is dangerously close to a positive, pro-AMD comment.
      You will now be hated here.

    • cygnus1
    • 3 years ago

    I know this is probably a pipe dream, but with the required features (namely LFC) more closely lining up with G-Sync capabilities, maybe nVidia will support this version of FreeSync.

    But since I’m definitely a pessimist, I’m also figuring FreeSync 2 monitors are going to have a price premium similar to G-Sync. If that happens nVidia support will only really matter at that point if the monitor manufacturers continue their trend and make more FreeSync 2 models than G-Sync.

      • VincentHanna
      • 3 years ago

      I have no doubt that Nvidia will support some type of color render matching and expanded color profile support. I am more or less indifferent as to whether they choose to support whatever standard AMD is using, and have no idea why it would be preferable to use the FreeSync 2 standard in lieu of whatever not-even-announced technology nVidia will eventually bring to bear to counter them.

        • AnotherReader
        • 3 years ago

        For optimal results, it seems that the game engine also needs to be aware of this functionality. If that is the case, then we would all benefit from a common standard.

      • DPete27
      • 3 years ago

      The “G-Sync Price Premium” is not related to panel specs like color gamut or refresh range, it’s in reference to the G-Sync module that has to be installed in a G-Sync monitor that introduces additional hardware AND licensing costs above and beyond panel specs.

      Yes, a FreeSync 2 monitor will certainly cost more than a FreeSync 1 monitor, but that’s going to be largely driven by differences in panel specs (ie 40-100Hz and HDR vs 40-75Hz and no HDR)

        • Voldenuit
        • 3 years ago

        [quote]The "G-Sync Price Premium" is not related to panel specs like color gamut or refresh range, it's in reference to the G-Sync module that has to be installed in a G-Sync monitor that introduces additional hardware AND licensing costs above and beyond panel specs.[/quote]

        Don't forget that every G-Sync monitor also has to have a backlight and panel that can support ULMB at 120 Hz or higher, so there may be [i]some[/i] additional hardware cost. Still not enough to justify the current price premiums, IMO, but once Vega comes out and 120+ Hz actually becomes relevant to FreeSync users, we might see more price competition between the two options.

          • Airmantharp
          • 3 years ago

          This is wrong- it’s not *every* panel, as there are 60Hz and 100Hz monitors with G-Sync that *do not* support ULMB. See the 34″ ultra-wides for one example.

          (it may be more correct to say that if the monitor is itself natively able to hit 120Hz+, then it may be required to support ULMB, but I’ve done no research on that)

            • Voldenuit
            • 3 years ago

            You are correct, as the 4K G-Sync monitors were 60 Hz and didn’t support ULMB as a consequence. Didn’t know about the ultra widescreen models, a bit surprised there.

    • Kretschmer
    • 3 years ago

    After all the issues I’ve had getting Freesync to work consistently, I’ll be skipping FreeSync2 hardware. I really appreciate AMD for providing adaptive sync without a $200+ markup, but the drivers mar an otherwise-impressive feature.

    Current oddity: games that do not work with adaptive sync in Fullscreen mode but do work in Fullscreen Windowed mode. Go figure.

      • Voldenuit
      • 3 years ago

      [quote]Current oddity: games that do not work with adaptive sync in Fullscreen mode but do work in Fullscreen Windowed mode. Go figure.[/quote]

      That does sound very weird. Last I heard, FS doesn't work with borderless windowed, so if a game isn't working in VRR for you, perhaps BLW is "fixing" things by turning off VRR.

      I've been pretty happy with my own G-Sync experience. There have been a few minor bugs, like Flash video flickering on my display running @ 120 Hz, but there were workarounds (strangely, switching focus or taking the mouse cursor off the main display helped), and I haven't even encountered the issue in the last couple of driver releases, so maybe it's been fixed. Then there are games that simply don't like high refresh - *cough*Skyrim*cough*Fallout*cough - and Rocksmith 2014 would lag and introduce distortion at 144 Hz; 120 Hz was fine, though.

      But for the most part, G-Sync has been fully plug-and-play for me. Now, the shoddy state of nvidia WHQL drivers in the past few months is another issue altogether...

        • Kretschmer
        • 3 years ago

        “That does sound very weird. Last I heard, FS doesn’t work with borderless windowed, so if a game isn’t working in VRR for you, perhaps BLW is “fixing” things by turning off VRR.”

        I can see my monitor adapting refresh rate with the OSD. In these titles, Fullscreen = Flat 144Hz; Borderless Windowed = Variable.

        I believe that there was a driver update in 2016 that enabled FS in windowed mode.

          • tay
          • 3 years ago

          Yes FreeSync is enabled in borderless windowed. Yes it is finicky no matter what the mode.

          • Voldenuit
          • 3 years ago

          Ah, thanks for the update, I was googling to see if AMD had introduced VRR to windowed modes, and not turning up any hits. Good to see that it’s been added.

      • RAGEPRO
      • 3 years ago

      I just got my first Freesync display and yeah, I have to say, it's really … I mean, even more than inconsistent, it's just -[i]weird[/i]-. Some games will only Freesync with Vsync off, some will only do it in fullscreen, some will NOT do it in fullscreen, some won't do it until you alt-tab out and back in... it's all really inconsistent and arbitrary. Freesync even breaks some games!

      I don't really think it's a driver problem per se, I think it's more that software ultimately just needs to be aware of adaptive refresh. I suppose drivers could work around it, though.

        • Voldenuit
        • 3 years ago

        [quote]I don't really think it's a driver problem per se, I think it's more that software ultimately just needs to be aware of adaptive refresh. I suppose drivers could work around it, though.[/quote]

        G-Sync seems to be "plug-and-play" for most users, though. Perhaps this has to do with how the G-Sync module has a hardware frame buffer, so games can simply run as though V-sync was "off" and output frames whenever they like to the framebuffer. Even so, many games have no trouble running in v-sync mode on a G-Sync display, and the monitor will seamlessly account for delayed frames and adjust the framerate accordingly.

        Makes me wonder if "mobile" G-Sync is as finicky as FreeSync seems to be, since they are essentially using the same technologies there.

          • LostCat
          • 3 years ago

          FreeSync seems pretty plug and play here. I've been wondering if it's part of the Win10 display chain now. One of those things MS doesn't really talk about directly.

    • meerkt
    • 3 years ago

    HDR and color range? They’re just losing focus with their branding and initiatives. They should call it something else and let FreeSync 2 indicate just vsync capabilities.

      • DPete27
      • 3 years ago

      Ay, but just like Freesync 1 was an open book for VRR, FreeSync 2 will probably be an open book for HDR coverage.

      Still, nice to see that they’re improving Freesync tech and adding some ease to consumers by establishing performance tiers (Freesync 1 vs FreeSync 2)

      • EndlessWaves
      • 3 years ago

      I suggest ‘Radeon Ryzen’

        • Voldenuit
        • 3 years ago

        How about “Wideon”?

        Short for Wide Gamut Radeon.

        “Hey kid, you wanna play those sick Wideon games?”

        • derFunkenstein
        • 3 years ago

        Ryzeon

          • SHOES
          • 3 years ago

          Fryzynceon

        • meerkt
        • 3 years ago

        RadeonEDID+VRR? 🙂

      • Wirko
      • 3 years ago

      FreeRange?

        • morphine
        • 3 years ago

        I only buy FreeRange monitors, hmph!

          • derFunkenstein
          • 3 years ago

          They better be antibiotic-free, too. Completely organic LEDs.

    • Voldenuit
    • 3 years ago

    No mention of blur reduction technologies (if any)?

    I love variable refresh rates, but I’ve found that I get even more mileage out of ULMB, which sadly means I have to give up VRR.

    Perhaps as panels (and their pixel latency and persistence times) continue to improve, the need for ULMB/Lightboost will fall by the wayside, but I don’t feel we’re there yet.

      • RAGEPRO
      • 3 years ago

      Even if you have perfect 1ms response time for every single transition you’ve still got the sample-and-hold blur problem. Until we hit insanely high refresh rates (>300 Hz) we’re gonna need ULMB and similar technologies to get truly clear motion out of LCDs.
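Some back-of-envelope arithmetic behind that claim: with sample-and-hold, a frame held for its full refresh period smears across roughly (eye-tracking speed × hold time) pixels. The 1000 px/s tracking speed below is just an example figure for a fast pan.

```python
# Rough illustration of sample-and-hold blur: perceived smear is approximately the eye's
# tracking speed multiplied by how long each frame stays lit. Example values only.

def smear_px(tracking_speed_px_per_s, hold_time_ms):
    return tracking_speed_px_per_s * hold_time_ms / 1000.0

speed = 1000.0  # px/s, a fast pan being tracked by the eye
for label, hold_ms in [("60 Hz full persistence", 16.7),
                       ("144 Hz full persistence", 6.9),
                       ("300 Hz full persistence", 3.3),
                       ("ULMB-style ~1.5 ms strobe", 1.5)]:
    print(f"{label}: ~{smear_px(speed, hold_ms):.1f} px of smear")
```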

        • Voldenuit
        • 3 years ago

        Good point, and even with super high refresh, sample and hold will still present as “blur” unless there are actual frames of motion in the updates (whether said frames are real or interpolated motion).

      • Chrispy_
      • 3 years ago

      ULMB is highly underrated and I agree it makes a huge difference, enough that I disable G-Sync to use it in almost every situation I can.

      I’ve often posted about the reasonably fast/cheap/simple algorithm that would allow ULMB and VRR to be done at the same time, it’s just that ULMB seems to be an afterthought so nobody has bothered implementing it.

      Another issue with ULMB is that it really doesn't work at all if you skip a frame. At 120Hz with ULMB my [s]eye[/s] brain sees infinite framerate because it's filling in the gaps between strobed frames by itself. When the game can't maintain a frame every 8.3ms (120fps), it'll double up a frame to 16.7ms (60fps), and that would seem smooth on a fixed 60Hz monitor, but with ULMB it's jarring because my eyebrain is trying to interpolate between strobes but the strobe still fires with the old image, which totally upsets the [s]applecart[/s] illusion of fluidity.

      Basically, ULMB pretty much REQUIRES vsync off, because even the best GPU and low graphics settings can stumble on badly-coded, rushed-by-the-publisher console ports that qualify as AAA games by Squarethesda Cryb[i]ea[/i]soft these days.

        • npore
        • 3 years ago

        Yeah, it's so underrated, and I would have hoped to have seen some implementation of it with VRR by now. I think Nvidia filed a patent for combining them in 2015. There seem to be quite a few different ways of doing it, some of which seem quite simple.

        I know what you mean about the jarring if you drop below 8.3ms. I find I'd rather take the odd drop with vsync on if I can stay under 8.3ms most of the time – ULMB with vsync is too nice 🙂 But yeah, there are a lot of games where it's not possible; and it does seem (at least to a non-expert) to be down to poor coding, when it's possible in a complex game like BF1.

        • Voldenuit
        • 3 years ago

        [quote]I've often posted about the reasonably fast/cheap/simple algorithm that would allow ULMB and VRR to be done at the same time, it's just that ULMB seems to be an afterthought so nobody has bothered implementing it.[/quote]

        Can you repost a link pls? I'm wondering how ULMB would get around the VRR problem, as varying frame times would mean you'd have to dynamically vary the strobe intensity to get consistent global illumination. I suppose you could get around it by buffering a frame and displaying every frame one frame late, but I wouldn't like the added latency that would incur.

        EDIT: Also, I agree that the perceived stutter from ULMB can be a bit jarring, but I like using it not for smoothness, but for improved image fidelity during fast flicks in FPS games.

          • psuedonymous
          • 3 years ago

          Without an ACCURATE future prediction of how long the NEXT frame will take to render, the only way to guarantee illumination persistence times inversely proportional to the instantaneous frame delivery rate is to buffer frames for at minimum the desired maximum inter-frame interval (e.g. a minimum buffer of 33.33ms for a minimum FPS of 30).

          This may work just fine for a normal desktop monitor, but the more you chow down that buffer, the less your framerate can dip below that minimum before you start getting brightness variations. And ANY buffering is a complete no-go for VR.

          • Chrispy_
          • 3 years ago

          No links, they’ll just be random comments to VRR news and articles here on TR most likely.

          The biggest hurdle by far is that varying the strobe refresh rate will affect the brightness, and you're right that buffering a frame would add latency, so that's not the solution. The thing to remember is that perceived brightness with ULMB is the sum of three factors: the frequency of the pulse, the duration of the pulse, and the intensity of the backlight.

          Now, it’s probably easier to keep the backlight intensity fixed, so that all the firmware/ASIC has to calculate is how long and when the light is pulsed. Since retinal persistence (decay) is much slower than photon detection (activation) you can afford to compensate for the delay since the previous frame by making the next pulse duration longer or shorter as required.

          This would introduce barely-perceptible brightness flicker if your frametimes were all over the place every other frame, but thankfully with VRR the difference in frametime from one frame to the next is more subtle. Sure, you may think that your framerate varies quickly, but even if it drops from 100fps to 30fps in the space of half a second, you still have several dozen frames in a frametime plot smoothing that framerate transition. You're looking at single-digit percentage changes in the time difference from frame to frame [i]at most[/i], which means you only need to change the strobe length by single-digit percentages, and this results in single-digit percentage brightness changes between frames.

          This means that the next frame is drawn immediately but at the brightness required for the previous frame. There's no input lag because your retina proteins do the buffering, and the algorithm for calculating the strobe duration is latency-free because it can use the same lookup-table system that overdrive applies. The only difference between overdrive lookups and pulse-duration lookups is that overdrive is based on the previous frame's [i]colour[/i] and the pulse duration is based on the previous frame's [i]timestamp[/i]. It's also one lookup per frame rather than one lookup per pixel, so it's much cheaper/easier to implement too.

          To put that into more context, retinal persistence decays so slowly that in the pure-white-screen tests of monitor hardware reviews, brightness uniformity of 80% from edge to corner is perfectly acceptable. Even on a full-bright, static, long-exposure image, a 20% brightness change isn't really noticeable in the time it takes us to sweep our eyes from the centre to the edges of the screen. I'd need to see it in practice, but I can't believe that small variations in brightness caused by a one-frame sync lag would even register unless your output was horrifically wrong.

          This system would absolutely flicker if you had severe microstuttering, but if your frametimes are 8ms, 60ms, 4ms, 50ms, 5ms, 75ms, etc., you have far more serious issues than your brightness flickering slightly.
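Here is a minimal sketch of the scheme described above, with an assumed duty cycle and assumed pulse-length limits; it's an illustration of the idea, not a shipping algorithm.

```python
# Minimal sketch of the strobe-duration idea above: keep backlight intensity fixed and
# stretch or shrink each pulse so the duty cycle (pulse length / frame interval) stays
# roughly constant, using the *previous* frame's interval as the estimate for the current
# one. Duty cycle and pulse limits are assumed example values.

TARGET_DUTY = 0.15                     # assume 15% of each frame interval is lit
MIN_PULSE_MS, MAX_PULSE_MS = 0.5, 4.0  # assumed panel limits

def next_strobe_ms(prev_interval_ms):
    """Pick the pulse length for the incoming frame from the last frame's interval."""
    return max(MIN_PULSE_MS, min(MAX_PULSE_MS, TARGET_DUTY * prev_interval_ms))

# Frame intervals drifting gradually from 100 fps toward ~80 fps (in ms):
intervals = [10.0, 10.4, 10.8, 11.3, 11.8, 12.3]
for prev, cur in zip(intervals, intervals[1:]):
    pulse = next_strobe_ms(prev)
    # Perceived brightness tracks pulse/cur; the one-frame-old estimate only costs a few
    # percent of error as long as frametimes change gradually.
    print(f"interval {cur:4.1f} ms -> pulse {pulse:.2f} ms, duty {pulse / cur:.3f}")
```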

            • Pwnstar
            • 3 years ago

            Wow. Intel, AMD or nVidia should hire you.

    • JustAnEngineer
    • 3 years ago

    [quote]FreeSync 2 displays will also be required to have Low Framerate Compensation capability.[/quote]

    Ding! Ding! Ding! That's the ticket to finally killing off NVidia's expensive proprietary G-Sync monster.

      • drfish
      • 3 years ago

      [quote]That's the ticket to finally [s]killing off[/s] performing consistently similarly to NVidia's expensive proprietary G-Sync monster.[/quote]

      FTFY. Oh, and don't forget that Nvidia still has to give a crap about it or we'll still have two "standards." At least Freesync 2 is a much-needed improvement.

        • DragonDaddyBear
        • 3 years ago

        Except it’s really a proposed proprietary standard riding in the open (“free”) standard of VESA Adaptive Sync. So, really, it will be 3 standards.

        • gerryg
        • 3 years ago

        Didn’t I read somewhere that Intel is adopting Freesync? Or GCN cores? Both? If Intel and AMD end up on the same page with compatible standards, the mass market will certainly move there. Nvidia would either have to cave and adopt compatibility, or go all in with more “premium” features in order to justify the extra price. Maybe they do both?

          • Airmantharp
          • 3 years ago

          I’d expect Intel to adopt Freesync, but because their designs happen so far ahead of their production, it’s going to be a while.

            • DancinJack
            • 3 years ago

            They said they were going to. [url]https://techreport.com/news/28865/intel-plans-to-support-vesa-adaptive-sync-displays[/url]

            • Airmantharp
            • 3 years ago

            I have a feeling this is at least the second time you've corrected me on this - sorry!

            • DancinJack
            • 3 years ago

            Haha nah, at least not that I recall. 🙂

          • jts888
          • 3 years ago

          Intel said it would eventually support VESA’s Adaptive-sync, which is really just the wire protocol tweak to DisplayPort (and now HDMI too I guess), while FreeSync is also the GPU/driver side support that takes advantage of variably timed refresh intervals.

          The other thing you might be thinking of is that Intel is now going to be licensing AMD’s GPU patents instead of Nvidia’s. There was a 2011 lawsuit settlement where Intel would pay Nvidia $1.5B over six years and get access to some patents (as well as a few from Intel to Nvidia), but it was really more about damages from killing off Nvidia’s nForce chipset business.

          Don’t expect Iris Pro to change substantially because of this, it’s just IP protection money going to a different partner now.

          • Voldenuit
          • 3 years ago

          Intel signed on to Freesync/VESA Adaptive Sync a long time ago, but two years later, still don’t have it implemented on their IGPs afaik.

          • Andrew Lauritzen
          • 3 years ago

          This whole thing is now massively confused by AMD calling this “Freesync 2” still. As far as I can tell from this press, “Freesync” is now just being used as an AMD branding of any display technologies. This is both incredibly confusing and muddies their argument about “open standards”.

          So yeah it’s pretty clear “Freesync” is now just as proprietary as “G-sync”… the only “open” bit is the VESA adaptive sync spec I guess. So I guess there’s no way for Intel or NVIDIA to support “Freesync” as the term now means today, just the former piece of tech.

          Hopefully I’m misinterpreting this but I see nothing to indicate any of this new stuff is meant to be an open standard, and the naming is mindbogglingly stupid if not.

      • DPete27
      • 3 years ago

        I've spent the last 1.5 months gaming on my brother's [url=http://www.newegg.com/Product/Product.aspx?Item=N82E16824025112]LG 29UM67P[/url] with a FreeSync range of 40-75Hz (no LFC) and can honestly say it wasn't hard to dial in game settings to stay consistently inside that range.** More so now with AMD's Framerate Target Control and Chill driver features. With all the framerate-consistency work being done as a result of Scott's Inside the Second stuff, I'm not sure having LFC is as big of a deal as people make it out to be.

        **Granted, I'm running a 2560x1080 monitor with an RX 480, so I have PLENTY of GPU power to push framerates in excess of 75fps even at max settings; I'm actually using FTC to cap delivery at 75fps to avoid tearing on the high end. But PC games have a variety of graphics quality settings that can be tweaked to fit lower-end GPUs into a given FreeSync range.

        • LostCat
        • 3 years ago

        My new mon is supposedly 1440p@48-75 with LFC though the jury is out on if those specs are accurate. It’s been amazing since I got it, definitely not buying anything without Freesync again.

        • Chrispy_
        • 3 years ago

        The 29UM68P and 34UM68P both handled under/overclocking admirably down to 30Hz and up to 80Hz, which means you can probably have LFC if you want it. The tool you want is ToastyX’s CRU and it’s a doddle.

        Honestly though, even with LFC working, there's not a lot of point in framerates below 40Hz anyway, for the same old reasons that low framerates have always sucked, even before VRR came around. Who wants to game at 25fps anyway? This isn't 1992 anymore!
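For reference, a small sketch of the frame-multiplication idea behind LFC as it's commonly described (not AMD's actual driver logic): when the game's frame rate falls below the panel's minimum refresh, each frame is scanned out two or more times so the panel stays inside its VRR window. The 30-80Hz window matches the CRU-extended range mentioned above.

```python
# Illustrative LFC-style frame multiplication under a simple doubling/tripling rule.
# The 30-80Hz window is just the example range discussed above.

def lfc_refresh(fps, vrr_min=30.0, vrr_max=80.0):
    """Return (multiplier, panel refresh in Hz) for a given game frame rate."""
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)
    mult = 2
    while fps * mult < vrr_min:
        mult += 1
    return mult, min(fps * mult, vrr_max)

print(lfc_refresh(25))  # -> (2, 50.0): 25 fps shown as 50Hz refreshes
print(lfc_refresh(12))  # -> (3, 36.0)
print(lfc_refresh(55))  # -> (1, 55.0): inside the native range, no multiplication
```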

          • Flapdrol
          • 3 years ago

          I'm running the 29UM68 with vsync turned off, so it starts tearing above and below the FreeSync range; LFC wouldn't do anything with vsync off anyway.

          I just cap fps to stay ~10% below the 75Hz max (using in-game limiters if possible), and below 40 the tearing isn't as noticeable due to the poor performance.

          But demanding LFC capability at least makes sure screens have a usable range. Some FreeSync screens have a 48-60Hz range; that's a bit too tight IMO.
