FreeSync 2 is coming to the Xbox One this spring

On Saturday, Microsoft launched a new series of monthly livestreams called Inside Xbox. The first episode had details on Sea of Thieves, PixArk, and the Xbox version of PlayerUnknown's Battlegrounds. However, the most interesting information was about the next system software update for the Xbox One S and Xbox One X consoles. According to Microsoft, Xbox gamers will be able to enjoy AMD's FreeSync 2 variable-refresh-rate (VRR) tech when the update hits later this spring.

Along with support for VRR, the upcoming update will also enable HDMI 2.1's Auto Low Latency Mode. Xbox systems will be able to tell displays to enable or disable their respective game modes automatically when the user starts or exits a game. Notably, Microsoft says that using either feature will require "a supported TV or monitor." We're not aware of any FreeSync-branded TVs, but we'd wager that the HDMI 2.1 VRR spec is a real close relative of the FreeSync-on-HDMI support already available on some monitors. Microsoft pledged support for the then-unreleased HDMI standard before the Xbox One X came out, and it seems that the company is taking steps toward making good on its promise.

We PC fans tend to think of FreeSync in association with high-refresh-rate gaming displays, but the technology arguably offers its greatest benefit at low frame rates. Most FreeSync displays don't enable variable refresh below 48 Hz or so unless their top end exceeds 120 Hz—roughly 2.5 times the minimum, the characteristic that lets them support AMD's Low Framerate Compensation. LFC requires support from the graphics driver as well as the monitor, so it will be interesting to see how well the console's mostly-30-FPS games behave in FreeSync mode.
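To make the arithmetic concrete, here's a minimal sketch of the frame-multiplication idea behind LFC, assuming a simplified driver model; the function and the example range are illustrative, not AMD's actual algorithm.

```python
# Minimal sketch of LFC-style frame multiplication under a simplified model.
# The function name and example ranges are illustrative assumptions only.

def lfc_refresh_rate(game_fps: float, vrr_min: float, vrr_max: float) -> float:
    """Return the refresh rate a driver might request from the display."""
    if game_fps >= vrr_min:
        # The frame rate already sits inside the VRR window: refresh tracks it.
        return min(game_fps, vrr_max)
    if vrr_max < 2 * vrr_min:
        # Window too narrow for frame doubling: fall back to fixed refresh.
        return vrr_max
    # LFC: repeat each frame until the effective rate lands back in the window.
    multiplier = 2
    while game_fps * multiplier < vrr_min:
        multiplier += 1
    return game_fps * multiplier

# A 30-FPS console game on a hypothetical 48-120 Hz FreeSync display:
print(lfc_refresh_rate(30, 48, 120))  # 60 -> each frame is scanned out twice
```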

Besides the new tech, the Xbox One is getting myriad minor improvements in the update. Users will be able to share clips and screenshots directly to Twitter, and the Xbox's built-in Edge browser is getting a functionality upgrade that will make it behave more like the desktop version. If anyone happens to use Mixer for game streaming, they'll be able to let viewers take control of the game remotely using a virtual or physical controller. Microsoft says the update is coming "later this spring."

Comments closed
    • Chrispy_
    • 2 years ago

    Freesync 1, Freesync 2? What does it matter?

    What AMD needed was LFC as a mandatory feature – with the top of the refresh range at least double the bottom.

    That means 30-60 Hz, 48-100 Hz, etc.
    If anything, Freesync 1 was hurt by shoddy 48-60 Hz implementations that couldn't support LFC.

    Given that television manufacturers are always happy to overclock their panels and add processing to give a pseudo-"400Hz", I'm pretty sure they won't be playing the silly 48-60 Hz card if they ever do adopt this. One can hope, at least!

    • UberGerbil
    • 2 years ago

    It’s true that this is pretty meaningless until TV mfrs adopt it as well, but consider: we’ve been in a chicken-and-egg situation. Neither console makers nor TV makers want to put any money/effort into features that will never be used, and both of them looked to the lack of support by the other as a reason why they didn’t adopt this tech. But now Microsoft has decided to be the chicken, and a pretty high-profile one as well. If random ScreenCo decided to adopt FreeSync it wouldn’t be noticed; Microsoft can’t move industries the way it could in its PC heyday, but this will get some consumers (vs tech nerds) asking the TV makers when they’re going to adopt it.

    (Also: Imagine if Sony had decided to be first. How likely do you think it would be they'd find some way to make it only work — or only work [i]well[/i] — on the Playstation if you connected it to a Sony TV?)

      • Shobai
      • 2 years ago

      What, like Gsync for AMD GPU?

      • LostCat
      • 2 years ago

      I suspect HDMI 2.1 VRR will change the game for all hardware manufacturers. AMD's and NV's proprietary crap will probably vanish, and everything supporting 2.1 will support it.

        • Zizy
        • 2 years ago

        Yes and no. AMD's proprietary crap = a brand on top of DP's VRR. Supposedly certified, but given the variation in screen quality, it might as well not be. NV's proprietary crap should indeed vanish.

        HDMI 2.1 will really change the game, though. It's essentially a copy of DP, just at higher frequency (needing new cables), so we finally have a modern spec the industry cares about.

          • LostCat
          • 2 years ago

          Freesync 2 is proprietary either way. I admit I don’t know the details but I’m happy if this entire mess just goes away.

            • cygnus1
            • 2 years ago

            No, it's not. An FS2 monitor is only implementing open standards. Check all the right standards off the FS2 list, and you get an FS2 badge for your monitor.

            • Ryu Connor
            • 2 years ago

            FS2 HDR is proprietary and requires API support in the game engine. FS HDMI is proprietary.

            Only an FS2 monitor used in FS1 mode via DisplayPort is not proprietary.

            • cygnus1
            • 2 years ago

            It was my understanding that both of those features were being included in HDMI 2.1 and a revision of DP (can't recall which one; one or both may already be included in an existing revision).

            • Ryu Connor
            • 2 years ago

            I have seen no confirmation of that at all. If you have articles or interviews from reputable sources I would love to see it.

            FS HDR does not adhere to any other TV HDR standard (HDR10 or Dolby Vision) and the fact it needs game engine support makes the situation even stickier. FS HDR is a different color space than both HDR10 and Dolby Vision. It’s a tweaked sRGB, not Rec. 2020 or DCI-P3. This means hardware manufacturers, game engines, and game artists have to curate for FS2 HDR. It is the wrong color space for UHD Blu-Ray, which will expect DCI-P3 or some % of Rec. 2020. The tech might get broader industry support, but as of this time I know of no such announcements.

            No one has detailed that HDMI VRR and FS HDMI as being compatible with one another. At this moment HDMI VRR looks to be yet another VRR standard added to the market (Adaptive Sync, GSync, FS HDMI, FS2 HDR, and HDMI VRR).

            [url]https://www.techarp.com/articles/radeon-freesync-2-hdr-gaming-tech-report/2/[/url]
            [url]https://www.anandtech.com/show/10967/amd-announces-freesync-2-improving-ease-lowering-latency-of-hdr-gaming[/url]

            If you have articles that prove HDMI VRR and FS VRR are compatible, I super duper want to see that. If you have content that shows FS HDR receiving broad industry support on the content side, I'm very interested.

            • cygnus1
            • 2 years ago

            I've been trying to find the articles I'd gotten that impression from, and I can't seem to find them now, so I'm probably wrong. I might have assumed it was the case because of the way FS1 is essentially VESA standards for the most part.

            My understanding of the colorspace for FS2 HDR is that it doesn't have a defined set. The AMD API talks to the monitor, takes its native color space, and then tone maps the game/video content directly to that native color space the monitor supports. That's part of how it lowers latency for HDR: that step doesn't also have to happen in the monitor like it does now. So instead of Game/Video output > GPU tone mapping to an HDR transport (HDR10 or Dolby Vision) > Monitor ASIC tone mapping to the native panel color space, you get the shorter path of Game/Video output > GPU/FS2 API tone mapping to the monitor's native color space > monitor simply displaying what the GPU sends with basically no processing.
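            Here is a rough sketch of the two paths described above; the stage functions are hypothetical stand-ins, not a real AMD, VESA, or Windows API.

```python
# Rough sketch of the two tone-mapping paths, using hypothetical stand-ins.
# No real driver or monitor API is being modeled here.

def tone_map(frame, target):
    """Stand-in for a tone-mapping pass to the given color space or transport."""
    return {"pixels": frame, "encoded_for": target}

def conventional_hdr_path(frame, panel_gamut):
    # Stage 1 (GPU): map the game's output to a standard HDR transport.
    transported = tone_map(frame, "HDR10 or Dolby Vision")
    # Stage 2 (monitor ASIC): remap that transport to the panel's native
    # gamut, which adds processing latency inside the display.
    return tone_map(transported["pixels"], panel_gamut)

def fs2_style_path(frame, panel_gamut):
    # The display advertises its native gamut/brightness up front, so the GPU
    # tone maps straight to it and the monitor simply scans the result out.
    return tone_map(frame, panel_gamut)

print(conventional_hdr_path("game frame", "native panel gamut"))
print(fs2_style_path("game frame", "native panel gamut"))
```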

            I’m definitely a fan of monitors and TVs doing as little processing as possible, so maybe I was just being hopeful it would be incorporated into an open standard. Just makes sense for the monitor to be able to advertise its color space support, maybe with something similar to what EDID does for resolutions supported, and let the output from your video source adjust to that.

            • Ryu Connor
            • 2 years ago

            [quote]My understanding of the colorspace for FS2 HDR is that it doesn't have a defined set.[/quote]

            It does have a defined set. It's sRGB with 125% coverage.

            [url]https://i0.wp.com/www.techarp.com/wp-content/uploads/2017/01/FreeSync-2-presentation-17.jpg[/url]

            This also matches a popular FS2 monitor from Samsung.

            [url]https://www.samsung.com/us/support/owners/product/curved-gaming-monitor-chg70-series[/url]

            [quote]sRGB Coverage Typ 125%, Min 120% (Adobe RGB: Typ 92%, Min 88%)[/quote]

            This is by design. FS2 HDR was meant to be a compromise.

            [url]https://www.techarp.com/articles/radeon-freesync-2-hdr-gaming/2/[/url]

            [quote]Optimised for HDR gaming, Radeon FreeSync 2 is a compromise between the wider colour gamut of HDR10 / Dolby Vision, and the low input lag of sRGB. It offers a limited HDR colour gamut that is over 2X the perceivable brightness and colour volume of sRGB.[/quote]

            This of course ties back directly to the AMD slide a few links up. As noted by the tiny little asterisk in that referenced slide, it also requires the developer to implement FS2 API support in their application. So software support is required.

            You do get one neat benefit when or if software appears. The AMD FS2 API can do something the Windows kernel can't: dynamically shift between SDR and HDR. Of course, all that potential is for naught unless some software support appears.

            I have my suspicions that HDR10 and Rec. 2020 will win the HDR format wars. Maybe there will be an FS3 that adopts those. It still wouldn't fix the FS2 API being proprietary, but at least the hardware capabilities would fall in line.
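            For a rough sense of how far apart those gamuts are, the standard CIE 1931 xy primaries can be compared by triangle area. This ignores luminance, so treat the ratios below as a crude illustration rather than a proper color-volume measurement.

```python
# Crude gamut-size comparison using triangle area in CIE 1931 xy space.
# Area in xy ignores luminance, so these ratios are only a rough illustration.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a triangle given (x, y) vertices."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Standard red/green/blue chromaticity coordinates for each color space.
GAMUTS = {
    "sRGB/Rec. 709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":        [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec. 2020":     [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

srgb_area = triangle_area(*GAMUTS["sRGB/Rec. 709"])
for name, primaries in GAMUTS.items():
    print(f"{name}: {triangle_area(*primaries) / srgb_area:.2f}x the sRGB triangle")
# Prints roughly: sRGB 1.00x, DCI-P3 ~1.36x, Rec. 2020 ~1.89x
```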

            • DoomGuy64
            • 2 years ago

            What you probably read was an article on the Samsung monitor, and assumed their hybrid HDR mode was something proprietary and non-standard, when it was just Samsung’s hybrid method of rendering HDR on their panel.

        • DancinJack
        • 2 years ago

        Yeah, i mostly agree with you, but monitor makers (and TV makers) are historically super, super, super awful and slow at putting new ports on TVs and monitors. We still see HDMI 1.4 on a ton of stuff being released now. It’s disgusting.

    • JosiahBradley
    • 2 years ago

    Maybe, just maybe, nVidia will support Freesync if TVs start rolling out with it and HDR. Probably a pipe dream because they have BFGD…

      • derFunkenstein
      • 2 years ago

      BFGD is super-niche, though. Shield Android TV, some HTPCs, and that’s about it. OTOH how many Xbones are in the wild? Tens of millions. Same for PS4s, if Sony goes that route (which I think they have to). The living room is one place where AMD and adaptive sync can make headway without Nvidia.

        • LostCat
        • 2 years ago

        Sony and monitor tech? I’d have to see it to believe it.

        (Though I guess I didn’t believe MS would do it either.)

          • DPete27
          • 2 years ago

          At least Sony makes/sells TVs. That’s more than MS can say. Sony can profit on both ends.

            • LostCat
            • 2 years ago

            None of which relates to 1440p or Freesync yet.

    • PrincipalSkinner
    • 2 years ago

    It’s useless unless they get TV makers to do the same.

      • derFunkenstein
      • 2 years ago

      40″ FreeSync 4K monitors beg to differ. [url=https://www.anandtech.com/show/11310/lg-43ud79-b-launched]Something like this[/url] would be a perfect companion in a gaming room.

        • BurntMyBacon
        • 2 years ago

        I don't see anything about LFC. The Dynamic Refresh Range is also unspecified, unless it really is 56-61 Hz, which would be both odd and uselessly small. I'm not sure this is the monitor you should be championing for gaming.

          • derFunkenstein
          • 2 years ago

          My bad. I didn’t look closely enough. There are others out there, though.

            • LostCat
            • 2 years ago

            Spring update also brings in 1440p, making the CHG70s and many other Freesync mons an option. Though, as I’ve said, it also accepts a 4K signal on HDMI so that may not be needed.

            I’d bring the X1X over here but I like my receiver too much for it.

      • albundy
      • 2 years ago

      +1 for marketing!

      • Zizy
      • 2 years ago

      Get ready for "Gaming TV"s this holiday season 🙂
      It took a while for the dumbasses leading HDMI to copy DP, but now all the pieces are here.

    • DancinJack
    • 2 years ago

    Very good news.

    Now if only TV makers would add AdaptiveSync/Freesync capabilities to their TVs. That’s the real catch.

    Third, I have been a BIG fan of Gsync. I love my GTX 1080 paired with a Gsync screen. I have advocated for it despite the cost because it actually has been a superior solution to Freesync. Having said that, I sincerely hope Nvidia will work with others and maybe tie some of their capability into Adaptive Sync some day. I know hardware is required for Gsync, but I have to imagine some of the mojo is in software as well. Maybe a coalition can develop a scaler at a drastically reduced cost that we can integrate into DP some day.

      • Helmore
      • 2 years ago

      What are the technical differences between GSync and Freesync that would make it superior? As far as I was aware, they are pretty much the same with no discernible difference, except for cost.

        • DancinJack
        • 2 years ago

        Historically, a lot of Freesync monitors have lacked LFC, which every Gsync module supports (not to mention that Gsync, independent of the monitor, supports VRR all the way down to 30Hz). It may not seem like a big deal for twitch gamers that run CS:GO at 120+, but when you're playing something at 1440p on Ultra, there can be some dips.

        The VRR range also hasn't been anywhere near consistent across Freesync implementations. I have seriously seen monitors that have a 20Hz VRR range with Freesync.

        Anecdotally, I have played with both in person and I prefer Gsync. It just feels more consistent to me. Here is a decent explanation of the features and capabilities: [url]https://www.rtings.com/monitor/guide/freesync-amd-vs-gsync-nvidia[/url]

        Freesync 2 definitely makes up for some of the first generation's limitations, but there is still so much variation, because the implementation of Freesync is largely up to the monitor maker, which results in cost-cutting measures that hurt it overall. Here's to hoping that changes.

          • DPete27
          • 2 years ago

          So the real difference is that Nvidia has a panel-spec requirement for GSync, and there's no such thing for FreeSync monitors.
          Don't forget, there ARE FreeSync monitors that have the same specs as GSync, so essentially you're comparing a tightly controlled monitor lineup (GSync) to a highly variable FreeSync lineup, and because the average FreeSync monitor is worse, you're saying GSync is better...

          Don’t make the mistake of associating the VRR tech with the monitor panel.

            • DancinJack
            • 2 years ago

            "The panel" is too broad here. We're almost exclusively talking about the scaler: Nvidia supplies its own for Gsync, while AMD lets panel makers use whatever scalers they please.

            Trust me, I’m not confusing VRR with particular panels.

            edit: spelling

            • DoomGuy64
            • 2 years ago

            The only issue with the scaler was that [i]early[/i] panels didn't support overdrive with freesync, because the manufacturers hadn't figured out how to get it working at the time. That's no longer an issue, hasn't been for a long time, and only existed on the first-edition models.

            LFC is completely done on the video card. You only need the proper refresh range for it to work 100%, like 50-100. Any monitor that does that or greater works 100% with LFC, while ones that don't are only partially supported. There is no such thing as a Freesync monitor not supporting LFC, because it's done on the video card.

          • DoomGuy64
          • 2 years ago

          AMD does LFC in software, and the range can be modified with CRU, unlike gsync afaik. There's no such thing as a single freesync monitor "lacking" LFC, because that's all done on the video card instead of the panel. The panel simply refreshes what the video card gives it within its range.

          edit: I suppose it is possible to not fully "support" LFC, but that's only in cases where the panel's refresh window is so small that the video card can't actually do LFC. Like a 60-70hz window, which is ridiculous and shouldn't exist. You have to be able to double the lower framerate into the upper range for LFC to work, or it doesn't work. That said, LFC still works in that case if your framerate dips to, say, 32 fps. It's not lacking the functionality, just the range.

          Freesync is consistent with the quality of the panel being used. Nvidia simply doesn't sell cheap gsync panels, like Apple products, while AMD is more like Android with plenty of knockoffs.

          Freesync is also NOT A PANEL FUNCTION. There is NO such thing as a "Freesync" panel. There are ONLY VESA Adaptive-Sync panels, which is a VESA STANDARD. These panels label themselves as Freesync simply because there is no other use for the technology, as only AMD is supporting it. Freesync 2 is no different. It's a blanket term for a standard feature set, but only AMD cards currently use it. Since only high-quality panels support F2, it gives the illusion that F2 is higher quality.

          There is no difference in functionality, there is only a difference in panels. Just buy a better monitor. There are plenty of Freesync monitors on par with Gsync monitors. The only difference is that you can CHOOSE to not buy them.

          • Zizy
          • 2 years ago

          TL;DR summary of the rtings article:
          Gsync is consistently good and expensive, around a $200 premium vs. an equivalent Freesync monitor. Freesync can be good, but it also exists on junk screens that have no price premium over no-VRR screens.

        • Krogoth
        • 2 years ago

        There is no difference between the standards. The difference comes from the monitors, not the interface or middleware.

        Nvidia just has tighter QC on its G-Sync monitor line-up. Anyone who claims that there's a difference between the VRR standards is a shill/rabid fanboy.

          • derFunkenstein
          • 2 years ago

          Except for being totally wrong in every assertion, this is a perfect post.

            • EndlessWaves
            • 2 years ago

            Except for containing no counter evidence, that’s a perfect rebuttal.

            • derFunkenstein
            • 2 years ago

            DancinJack posted a pretty decent link. For me to re-post it would be insulting to Krogoth’s reading comprehension.

            • DoomGuy64
            • 2 years ago

            Did *YOU* READ IT? The only thing in that link is outlining differences in the PANELS, NOT THE TECHNOLOGY.

            Arguing panel quality is a STRAW MAN. Krogoth is over 9000% right.

            If you wanna talk technical details, how about the fact that Nvidia doesn’t actually support adaptive on their video cards, because the Gsync module is doing the work of the video card with a BUFFER, while AMD does adaptive directly on the video card. LOL, that’s why Nvidia needs that expensive FPGA module, and the high cost is also why it will never exist on cheaper panels. No point in it. Meanwhile, freesync is on everything because it is just using the Vesa spec!

            If you want a quality Freesync panel, buy a quality Freesync panel. You can’t stop the cheaper panels from being made, because it is a VESA standard, and freesync is done on the videocard. All the features of Freesync are done on the video card, while panel quality and features are dependent on the panel.

            Freesync and Gsync are using completely different methods of achieving adaptive, and panels are a separate issue. Apples and Oranges.

            • derFunkenstein
            • 2 years ago

            They’re completely interchangeable. “Does this G-Sync display support LFC?” is a question that never has to be asked. “What’s the VRR range on this display?” is not a point of discussion for G-Sync.

            FreeSync displays also come in at half the cost of comparable G-Sync panels (although thanks to those unknowns, some things that seem comparable really aren’t), so there’s good with the bad. I just won’t pretend or play cheerleader for either side like…well, like some folks around here. 😉

            • DoomGuy64
            • 2 years ago

            There is no freesync panel that doesn’t “support” LFC. None. The video card does LFC.

            Now LFC does require a certain range to work, but that doesn’t mean it doesn’t work at all on those junky panels, which anyone with half a brain should be avoiding anyway. 32 fps will give you LFC on a freesync panel that does 60-70hz, while everything not in that range will either tear or get dropped.

            Simple fix: Don't be that guy who buys a 60-70hz panel. That's retarded, and you are retarded for even trying to make that argument. That's not a gaming monitor, that's a regular monitor that merely supports the new display standards. The lower range has to fit in the upper range. If you want a gaming panel, buy a proper gaming panel, and there are way more of them now than ever before.

            • derFunkenstein
            • 2 years ago

            The fact that panels with those specs can carry the FreeSync branding is shameful on AMD’s part, and trying to pretend they don’t exist doesn’t make them go away. Keep your personal attacks to yourself.

            • DoomGuy64
            • 2 years ago

            Then keep your falsehoods to yourself. Branding may be an issue, but that's only an issue with fanboys looking for a nit to pick, because there really isn't any other valid point to argue. Nobody else cares. I don't think you'd change your mind even if AMD did crack down on the branding, only the point of attack.

            Hell, Freesync2 essentially is that branding, and you’re still arguing about it. IMO, it’s totally appropriate to call out that behavior. What’s not appropriate is going around spreading FUD. Just look at the monitor’s refresh range, and all of these “issues” go away. Like magic! Is that so hard?

            The problem isn’t branding. The problem is people complaining about specs that are easily avoided. It’s so easy you wouldn’t be able to complain if you didn’t know about them, now would you?

            • derFunkenstein
            • 2 years ago

            What are you calling me a fanboy of? Good hardware? Guilty as charged. Can't be Nvidia though. I recently [url=https://techreport.com/forums/viewtopic.php?f=3&t=120694]made a profit[/url] by dumping my Nvidia hardware. And I know about these specs, but people buying monitors don't necessarily. Cheaping out like that must work in spades, or else the hardware would not sell. Someone's buying them.

            • DoomGuy64
            • 2 years ago

            Fanboy or not, the logical fallacy behavior is there. Maybe you just like complaining over nothing?

            Of course people are going to buy cheap monitors. Doesn’t mean they’re buying them for gaming, and ALL monitors using the new standards are going to support freesync. Is that a problem? I don’t think it is. If you want a gaming monitor, buy a gaming monitor, not a workstation or value monitor.

            The people who buy those monitors aren’t losing anything, they’re gaining. It’s one feature they wouldn’t have had otherwise. The people who buy gaming hardware aren’t those people. If you’re smart enough to build, you’re smart enough to buy a good monitor. If you buy prebuilt, it’s coming with a monitor. People who buy value freesync monitors aren’t buying them for freesync. That’s just a catchy bonus, and why it’s labeled as such. Like extra ram on low end video cards.

            • Redocbew
            • 2 years ago

            [quote]Maybe you just like complaining over nothing?[/quote]

            Hmm... Projecting in a thread about displays. There's a joke there somewhere.

            • DPete27
            • 2 years ago

            I think the worst part of the whole thing is that manufacturers aren't clearly stating the specs of their FreeSync monitors. The VRR range (__Hz to __Hz) and whether or not it supports LFC (Yes/No) should be line items in the spec sheet, but most of the time you can't find the VRR range even on the manufacturer's own website.

            GSync just makes it "easier" for people with deep enough pockets because Nvidia simply requires a high spec on all models in order for the module to be compatible. I also think the $150-ish cost of the GSync module increases the price of lower-priced/spec'd monitors by too large a percentage to be viable/marketable.

            As you said yourself, the wider spec/price range of FreeSync monitors isn't their weakness, it's the poor communication to the customer as to what they're getting.

            • Krogoth
            • 2 years ago

            Yep, I have seen both G-Sync and Freesync in action with quality monitors (LAN parties).

            There is no difference in terms of output assuming the GPUs in question are able to handle the framerate.

            The whole "Freesync is crap" FUD is due to lower-end panels using it. This doesn't exist in the G-Sync world because Nvidia forces their partners to make Gsync exclusive to higher-end monitors.

          • floodo1
          • 2 years ago

          That's great, except the standard is subject to how it is implemented, and in practice Gsync monitors are superior (-8

            • Krogoth
            • 2 years ago

            The difference comes entirely from the monitors. Nvidia just enforces tighter QC with their partners. The extra hardware in Gsync 1.0 is an artifact from a time before VESA finalized VRR over DisplayPort. It is no longer necessary with DisplayPort 1.2a or newer/HDMI 2.0.

          • chuckula
          • 2 years ago

          I’ve been busy. Thanks for filling in. Try being a little less serious-sounding though.

        • BurntMyBacon
        • 2 years ago

        The biggest issue was in implementation. Even though Freesync specified a dynamic refresh range of 9-240Hz, early implementations generally had high minimum refresh rates by comparison to G-Sync and lacked the ability to use LFC (Low Framerate Compensation). To guarantee the performance they wanted, nVidia required use of their own hardware for G-Sync. AMD, however, had to rely on a very limited number of scalers that supported DisplayPort Adaptive-Sync, and scaler manufacturers did the minimum required to keep costs (and risk) down. AMD envisioned a wide range of Freesync-capable monitors from budget to premium, and to be fair, nothing in AMD's original Freesync specifications stopped manufacturers from building a monitor that would be fully Freesync 2 compliant. On the other hand, given that one of the proposed benefits over G-Sync was lower cost, it shouldn't come as a surprise that the majority of the focus was on cost.

        With the launch of Freesync 2, the requirements have been improved. Wider dynamic refresh ranges, LFC capability, some level of HDR support, and even a reduction in processing latency have all been specified. There should be little discernible difference in performance between a Freesync 2 (or a well-specified Freesync) monitor and a G-Sync monitor. I certainly don't think there is an appreciable difference between my G-Sync and Freesync monitors, other than the fact that I didn't have to do anywhere near as much research to find a good G-Sync monitor. I'd definitely recommend finding an up-to-date Freesync listing and filtering out the junk before buying. [s]Freesync 2 should get rid of this issue.[/s] While most of the concerning specs have been resolved, it appears that Freesync 2 still allows for high minimums on the dynamic refresh range. I don't even consider monitors with minimums higher than 40Hz. A monitor with a 40Hz minimum and LFC works decently well, but people who like to crank up the quality settings and are sensitive to the crossover point (not nearly as many as I once believed) may appreciate the extra range down to 30Hz.

          • DPete27
          • 2 years ago

          FreeSync2 is a set of minimum requirements to be met in order to attain that certification, much like the "unofficial"(?) specs that Nvidia requires for GSync. Yes, the actual minimum frequency of FS2 is not specified, just the 2.5x ratio between the highest and lowest frequencies needed to support LFC.

          The original FreeSync had LFC support as long as you had the 2.5x range. There will be some grey area (some monitors that have LFC but not the other requirements of FS2), but for the most part FS2 will be the premium spec that's on par with GSync, and FS(1) will be the other, cheaper stuff.
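          As a quick illustration of that 2.5x rule of thumb (AMD's exact certification criteria may differ), a simple range check is all it takes to filter out the narrow windows criticized earlier in the thread:

```python
# Quick check based on the 2.5x ratio mentioned above. Treat it as a rule of
# thumb; AMD's actual FreeSync 2 / LFC certification criteria may differ.

def lfc_capable(vrr_min_hz: float, vrr_max_hz: float, ratio: float = 2.5) -> bool:
    """True if the VRR window is wide enough for LFC-style frame doubling."""
    return vrr_max_hz >= ratio * vrr_min_hz

print(lfc_capable(48, 144))  # True  (144 >= 2.5 * 48 = 120)
print(lfc_capable(40, 60))   # False (the kind of narrow window criticized above)
print(lfc_capable(48, 75))   # False under the 2.5x rule, even though 75 > 48
```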

            • cygnus1
            • 2 years ago

            I look at FS2 as slightly higher end than G-Sync, if only because of the required HDR (in some format) support. But I agree, an FS2 stamp should mean a lot less digging is needed to confirm whether a given monitor will make for a good gaming monitor or not.

        • DoomGuy64
        • 2 years ago

        The difference is that Nvidia doesn't actually support adaptive sync on their video cards, so they made a module to do it on the monitor. Gsync was brought to market [i]before[/i] VESA standardized any hardware or cables, so they used a hack to get existing hardware to work. The Gsync FPGA module uses a BUFFER to refresh VSYNC OFF as adaptive. It does the work. AMD properly does it through fully compliant VESA-standard hardware. This is why GCN 1.0 doesn't support Freesync, while Kepler supports Gsync.

        In reality, I bet Fermi (or any other card, including GCN 1.0) could also "support" Gsync, since Nvidia made the technology AFTER they had existing video cards, with no change in cable or display standards. You would simply have to find out how Nvidia is talking to the module and spoof it. Everything that Gsync does on the FPGA is instead done on AMD's video cards. This includes LFC.

        The reason why Gsync appears "superior" is because Nvidia controls the proprietary Gsync FPGA supply and only sells Gsync with high-quality panels. AMD can do NO SUCH THING, because adaptive sync is a VESA standard, and Freesync is done in software. If you want a comparable Freesync panel, you [i]have to buy a more expensive monitor.[/i] That's the difference.

      • the
      • 2 years ago

      With this move, I do see nVidia adopting the VESA VRR (basically FreeSync 1 without the branding) and HDMI VRR. This is the weight that shifts the market toward a particular implementation.

      I don’t think Gsync will die but nVidia simply cannot keep their GPU hardware exclusive to it. The raw number of FreeSync 1 and 2 displays will just be too large to ignore.

        • Krogoth
        • 2 years ago

        Nvidia already supports the VRR-over-DisplayPort spec through its mobile GPUs. They are just trying to milk Gsync 1.0 until it is no longer economically viable. They did the same thing with SLI being exclusive to their chipset platforms until they were no longer able to compete against Intel.

      • Goty
      • 2 years ago

      [quote]I know hardware is required for Gsync, but I have to imagine some of the mojo is in software as well.[/quote]

      Mobile G-Sync requires no extra hardware, only eDP (for VRR and PSR), to function. I'm not sure if regular old DisplayPort supports PSR, but if it does, that kind of puts the kibosh on any "need" for extra hardware to do G-Sync.

        • DancinJack
        • 2 years ago

        Right, but there is still the driver implementation. As we have seen time and time again, AMD and Nvidia screw those up all the time.

        As I said above, I wish Nvidia would just join the rest of the crowd and use Adaptive Sync, but it better be as good as (desktop) Gsync.

        edit: added desktop qualifier to Gsync.

      • superjawes
      • 2 years ago

      I think getting the tech in front of people is the biggest barrier to “typical” TV adoption. Both variable refresh specs have been gaming focused since we (gamers) tend to get the biggest benefit, but the TV market at large is focused on sports, movies, Netflix, and traditional TV programming. All that media would benefit from adaptive refresh rates but to a lesser degree.

      With that in mind, the XBOne is probably a good choice to expand beyond gaming monitors. Sure, it’s still technically a gaming machine, but it does assert itself as a comprehensive media device, and if benefits start leaking out into other media, more of the “everyone else” crowd will be interested, and that could lead to another round of TV launches and purchases with adaptive refresh support.

        • BurntMyBacon
        • 2 years ago

        With the number of useless features that TV manufacturers have touted as game changing improvements, I’d be very surprised if adaptive refresh rates don’t eventually make it on the spec list. At the very least, it would provide a bullet point for marketing.

      • floodo1
      • 2 years ago

      It's not just the scaler … for some reason, some Gsync monitors have "overclocked" LCD refresh rates that are not present on their Freesync equivalents. So in practice Gsync is more than just VRR and low latency (-8

      Freesync has a long way to go to reach functional parity
