Project Scorpio will support FreeSync 2 and HDMI 2.1 VRR

Late last week, Microsoft unloaded a whole bunch of information about its upcoming Xbox One refresh, currently codenamed Project Scorpio. It turns out that the company was keeping one more tidbit of info back to make sure the console got the attention it deserved. Project Scorpio will support both FreeSync 2 and HDMI 2.1's variable-refresh-rate standard.

This is great news for both PC and console gamers. The support for FreeSync 2 in particular means that the console can make full use of a display's HDR color reproduction and Low Framerate Compensation. Right now, PC monitors are the only displays that support VRR, so anyone plugging Project Scorpio into an AMD FreeSync display will see the benefits of this technology first. This move by Microsoft means that a major piece of consumer electronics is now offering support for VRR, something that should help accelerate its adoption into more displays and presumably bring the cost of the technology down in the process.
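For readers curious what Low Framerate Compensation actually does, here's a rough sketch of the idea. The 40–144 Hz window and the function name are illustrative only, not taken from any actual driver: when a game's frame rate drops below the display's minimum refresh rate, the driver repeats each frame enough times to push the panel's refresh back inside the supported range.

```python
# Illustrative sketch of Low Framerate Compensation (LFC).
# The 40-144 Hz VRR window and function name are assumptions for
# the example, not values from any real display or driver.

def lfc_refresh_hz(frame_hz: float, vrr_min: float = 40.0, vrr_max: float = 144.0) -> float:
    """Return the refresh rate the panel should run at for a given
    game frame rate, duplicating frames when the rate falls below range."""
    if frame_hz >= vrr_min:
        # In range: refresh the panel once per rendered frame,
        # capped at the panel's maximum refresh rate.
        return min(frame_hz, vrr_max)
    # Below range: show each frame an integer number of times so the
    # effective panel refresh lands back inside the supported window.
    multiple = 2
    while frame_hz * multiple < vrr_min:
        multiple += 1
    return frame_hz * multiple

print(lfc_refresh_hz(25.0))  # → 50.0 (25 fps shown as two refreshes per frame)
```

The point is that a game dipping to 25 fps still gets smooth, tear-free pacing on a display whose panel can't refresh slower than 40 Hz.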

For those who like playing older games, Digital Foundry reports that even Xbox One games and Xbox 360 games played through backwards compatibility on Project Scorpio will see the benefit of variable refresh rates. Project Scorpio doesn't have a price, release date, or even an official name, but this is one more reason to look forward to the console's release, whether you're picking one up or not.

Comments closed
    • Firestarter
    • 3 years ago

    this is awesome news! now all we need is an ETA on freesync in intel GPUs (that they’re ostensibly planning)

    • Chrispy_
    • 3 years ago

    Freesync TVs can’t come soon enough.

    Even the best TVs are *dire* when it comes to input lag and refresh rate.

    • Kretschmer
    • 3 years ago

    As someone who just bought an expensive GSync monitor, I hope that FreeSync eventually improves and displaces GSync. Standards are good and fragmentation is bad.

      • TwoEars
      • 3 years ago

      Yupp, it would be nice. I feel that a lot hinges on the performance of AMD Vega; as long as Nvidia controls the high-end market, Nvidia can keep running with GSync and people will still buy their products.

      • JustAnEngineer
      • 3 years ago

      VESA standard adaptive sync (aka FreeSync) with LFC enabled performs as well as NVidia’s expensive proprietary G-Sync, but a system based on the open standard costs the consumer a whole lot less money.
      http://pcpartpicker.com/products/monitor/#A=2&sort=a8&page=1

        • Kretschmer
        • 3 years ago

        That’s theoretically true, but GSync products often seem to deliver better specs (e.g. higher refresh rates, etc.). That could be due to product segmentation or the custom chip, and I don’t know how to comment on the difference.

    • NoOne ButMe
    • 3 years ago

    If anyone watched the LTT podcast-thingy with Scott Wasson, there was talk about two different ecosystems for TVs and monitors.

    This could hopefully combine them, leading to enhancements to either getting to both faster.

    Like HDR to monitors faster!

      • DragonDaddyBear
      • 3 years ago

      I’m with him on the 4k thing. I don’t own a 4k tv for the same reasons he mentioned. I want HDR but with this on the horizon I’m going to keep waiting.

    • southrncomfortjm
    • 3 years ago

    Big news. Really big news. How long until we get a 60-inch 4K TV with VRR over HDMI? Hopefully very soon – not because I want to use one with a Scorpio, but because I’d hope it would work with my PC and RX480.

    • TwoEars
    • 3 years ago

    Scorpio is just looking better and better, impressive work by MS here.

      • ImSpartacus
      • 3 years ago

      Yeah, I’m beginning to understand why it is “late” compared to the ps4 pro. There’s a bit more going on that needs more time in the oven.

      • K-L-Waster
      • 3 years ago

      Just remember, *everything* is impressive in pre-release announcements.

    • tipoo
    • 3 years ago

    Nice. Everything should support Freesync. LOOKING AT YOU NVIDIA.

    I wonder if that weird custom displayport through hdmi jiggery that allowed the launch PS4s to support HDR would allow an update with it?

      • DPete27
      • 3 years ago

      One more nail in GSync coffin.

      • NoOne ButMe
      • 3 years ago

      Nvidia supports Freesync FYI.

        • odizzido
        • 3 years ago

        They do? First I’ve heard of it.

          • Voldenuit
          • 3 years ago

          Well, G-Sync on laptop displays uses a system identical to Freesync, but I agree it’s a bit of a stretch to say that nvidia ‘supports’ Freesync.

            • Krogoth
            • 3 years ago

            They are going to call it Gsync 2.0 when it goes onto laptops, once Gsync 1.0 becomes hopelessly out of date.

          • NoOne ButMe
          • 3 years ago

          As Voldenuit says, Nvidia supports Async in laptops branded as Gsync. As Freesync seems to just be Async but branded Freesync I think saying they support it is fair.

          Or rather, Nvidia supports Async, which every single Freesync monitor supports.

            • DPete27
            • 3 years ago

            Async and FreeSync are absolutely not the same thing.

            • NoOne ButMe
            • 3 years ago

            As I understand it, Freesync is basically branded Async that has, since launch, added a few nice things like supporting HDMI.

            Freesync at its core is just Async.

            Or did you think by Adaptive-sync I meant in terms of the graphics card? I am talking about the VESA Adaptive-Sync standard.

            • jts888
            • 3 years ago

            At its inception, FreeSync was the GPU hardware and driver support needed to make VESA Adaptive-sync useful, as well as a branding program for display makers.

            Adaptive-sync is just a relaxation of DP 1.x rules saying that a monitor could advertise a refresh rate range to a GPU and that it promised to accept and display any frames the GPU delivered that adhered to that window.

            Aside from the various driver/API hooks allowing software to push frames at varying intervals, FreeSync is also a tiny bit of new dedicated timer logic on the silicon, which is why 1st-gen GCN and older AMD/ATI cards couldn’t just add the functionality via drivers alone.
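The Adaptive-sync contract described above can be sketched in a few lines: the display advertises a refresh window, and scanout happens whenever a frame arrives inside it, clamped to the window's edges otherwise. The 40–144 Hz range and function names below are made up purely for illustration:

```python
# Illustrative sketch of the VESA Adaptive-sync refresh window.
# The 40-144 Hz range and names are assumptions for the example.

MIN_HZ, MAX_HZ = 40.0, 144.0
MIN_INTERVAL = 1.0 / MAX_HZ   # panel can't refresh faster than this
MAX_INTERVAL = 1.0 / MIN_HZ   # ...or hold the previous frame longer than this

def next_scanout(last_scanout: float, frame_ready: float) -> float:
    """Time (seconds) at which the panel scans out, given when the previous
    scanout happened and when the GPU finished the new frame."""
    earliest = last_scanout + MIN_INTERVAL
    latest = last_scanout + MAX_INTERVAL
    if frame_ready <= earliest:
        return earliest   # frame arrived too fast; wait for the panel
    if frame_ready >= latest:
        return latest     # no frame in time; re-display the old one
    return frame_ready    # inside the window: scan out immediately
```

A frame that lands anywhere inside the window is displayed the moment it's ready, which is the whole appeal: no tearing and no fixed-cadence judder.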

            • DancinJack
            • 3 years ago

            async USUALLY stands for asynchronous. Using the proper names would help people not be confused. It’s not that hard.

            I have NEVER seen anyone call Adaptive Sync “async.” Not once until you did.

            • K-L-Waster
            • 3 years ago

            Pfft. Since when did technical details like that matter in technology?

            • DancinJack
            • 3 years ago

            I know, it’s crazy.

            • DPete27
            • 3 years ago

            Yeah, I thought he was referring to asynchronous [compute] hence why he was claiming Nvidia supports it (although not really until Pascal)

            • jts888
            • 3 years ago

            Maxwell and Pascal both support the general principle of multiple shaders being chained together in a dependency graph and having the right ones called when they’re ready, which is what the core of “asynchronous compute” really is. However, neither supports what AMD calls “async compute”.

            What AMD does that no Nvidia product excluding GP100 (not sure this is confirmed) can is to allow instructions from both graphics shaders (those that can access fixed function blocks like geometry setup and rasterizers) and compute shaders (those that can use only the shader ALU array) to be dispatched from the same SIMT blocks on a clock-by-clock basis.

            Maxwell allows their shader array to be partitioned, with some SMs doing any combination of compute shaders and some SMs doing the graphics shading, but the repartitioning was so slow (nearly a ms for all the flushing and state reconfiguration, etc.) that it was considered nearly useless and a hazard for VR engine development. Consumer-level Pascal chips still are constrained to SM partitioning controlled by the host driver, but the change overhead has been improved by at least an order of magnitude.

            So the situation now is kind of a mess in that:
            - AMD's calling their fine-grained interleaving "async compute" was poorly considered, given that the term was already used for something else.
            - Nvidia's hardware still doesn't have the ability to do what GCN does (avoid ALU stalls on memory by juggling multiple shaders on the fly).
            - It's not even very clear how much benefit graphics/compute shader sharing of GCN CUs really brings, given that it still appears to be challenging to write shaders that can play nice with each other with the shared resources.

            • DPete27
            • 3 years ago

            Indeed. Maxwell async compute is/was so useless that I didn’t include it in my comment for simplicity. I did have a sentence in my comment that mimics your statements that async on Pascal being a little more “half-baked” than on AMD, but again for simplicity, I gave Pascal the benefit of the doubt.

            • tipoo
            • 3 years ago

            In laptops with a built in display. Which makes them *technically* supporting it useless to system builders, and even that can be argued as per the other comments.

            • Laykun
            • 3 years ago

            OK cool, so can I plug a FreeSync monitor into my nvidia graphics card? That’s support, what you’re talking about is called “reaching”.

            • NoOne ButMe
            • 3 years ago

            Nvidia could flip the switch and launch the driver. It should be near 100% compatibility, unless Nvidia deliberately wrote their Gsync mobile code to not function on monitors with VESA Async scalers.

            • Laykun
            • 3 years ago

            And how does that make my nvidia card compatible with a freesync monitor? Can I flip the switch? No, so it’s not compatible. I’m sure the hardware is capable, but the software simply isn’t there to support it.

            • K-L-Waster
            • 3 years ago

            There’s a world of difference between “they could support it” and “they do support it”.

            • tipoo
            • 3 years ago

            “Could” is still academic to this discussion. For system builders needs, it’s not supported by Nvidia.

          • ImSpartacus
          • 3 years ago

          Freesync and mobile gsync are both implementations of a vesa vrr standard.

          So mobile gsync and freesync have similar parentage, but they aren’t the same. They are separate implementations.

            • NoOne ButMe
            • 3 years ago

            I assume that mobile Gsync would work on every Freesync monitor, except ones that are exclusively HDMI. If any exist.

            But yes. I should have been more clear. Nvidia doesn’t support Freesync per se; they merely have a (near) identical version of the VESA Async standard, which should work with every single Freesync monitor.

            Unless Nvidia went out of their way to make it so their code/platform for G-sync mobile requires them to add every single panel used individually. I could see people high up in Nvidia doing that to make sure no one “hacks” it and loses their Gsync monitor profits.

      • psuedonymous
      • 3 years ago

      “I wonder if that weird custom displayport through hdmi jiggery”

      No DisplayPort at work, just regular HDMI. While the two turn up in tandem due to being developed and released at the same time, HDR and UHD are not linked within the HDMI standard. You can have UHD SDR or 1080P (or 480P if you so wish) HDR content carried over HDMI. The PS4’s HDMI controller was just overbuilt (or built with future standards in mind).

        • tipoo
        • 3 years ago

        Check the console hacking video; they do HDMI over a DisplayPort bridge, which is pretty weird. They also do SATA over a USB bridge.

        http://67.227.255.239/forum/showthread.php?t=1328638

      • jts888
      • 3 years ago

      It’s going to be interesting to see how Nvidia handles this exactly.

      FreeSync is a much simpler protocol compared to G-sync, which uses bidirectional communication back to the GPU to inform it that the display is ready to receive a frame, and Nvidia’s silicon can demonstrably support Adaptive-sync already in laptops at least.

      Will they:
      - support only HDMI 2.1 VRR on HDMI, and insist on G-sync only for higher-bandwidth DP displays?
      - stick a wrench in the works by torpedoing HDMI VRR, or litigating with G-sync patents against display makers who use it?
      - try to add new features to G-sync to make it a premium product? (a-sync and strobing, finally?)
      - just roll over and make "G-sync 2" equivalent to FreeSync in every meaningful way?

        • Firestarter
        • 3 years ago

        I think they’ll stubbornly refuse to acknowledge the situation until Intel releases iGPUs with freesync enabled

    • zaeric19
    • 3 years ago

    Do we know if HDMI 2.1 VRR is an implementation of FreeSync?

    It would be great if everyone got on board with a single VRR standard. Having a major piece of hardware come out with FreeSync support besides an AMD video card could help make it the de facto standard. I don’t see anyone besides Nvidia ever using G-Sync, but Nvidia seems oddly stubborn in keeping it around.

    Edit: Grammar/clarity.

      • RAGEPRO
      • 3 years ago

      HDMI 2.1 VRR is purportedly based on the hacky Freesync-over-HDMI thing, so yes.

      • DPete27
      • 3 years ago

      From this article (https://techreport.com/news/31244/hdmi-2-1-specification-brings-60hz-8k-video-and-vrr-tech) it sounds like it's unclear whether HDMI 2.1 VRR is a variation of FreeSync or not. Let's hope it's just FreeSync. The world is broken enough with two VRR standards, let alone three.

        • zaeric19
        • 3 years ago

        Exactly what I was referring to. So Scorpio is supporting it, but we don’t really know what it is; Microsoft surely knows things that we don’t. From the link below, it does not appear that there is much public info about HDMI VRR. The compliance test specification should arrive in Q2 or Q3, at which point it had better be known, since it would be required for compliance.

        http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

        • LostCat
        • 3 years ago

        It wouldn’t really be three, because Freesync and G-Sync would be forgotten about in minutes.

      • NoOne ButMe
      • 3 years ago

      According to Digital Foundry, they support Freesync 2, and will support HDMI 2.1 VRR once it is finalized.

        • zaeric19
        • 3 years ago

        I read the article; I know Scorpio supports FreeSync 2. I am wondering if HDMI 2.1 VRR is an implementation of FreeSync over HDMI, similar to Adaptive Sync for DisplayPort, or something new altogether.

          • DPete27
          • 3 years ago

          Well, there already is FreeSync over HDMI. I think it requires HDMI 2.0. So perhaps “HDMI 2.1 VRR” is the same but with added features (HDR/higher frequency range/etc).

            • zaeric19
            • 3 years ago

            It is not officially supported by HDMI though. FreeSync means that a product is certified by AMD and VESA. There is no equivalent for HDMI as VESA only covers DisplayPort.

          • NoOne ButMe
          • 3 years ago

          Per Digital Foundry’s video, it supports Freesync 2 over HDMI. There is no DisplayPort question here, as the Scorpio does not support DisplayPort.

          Freesync is an implementation of Async VESA standard. Freesync over HDMI is the implementation except over HDMI.

          I feel like I am missing something in the question.

      • rechicero
      • 3 years ago

      It’s not stubbornness, it’s business. A monitor is something that lasts a lot of years and every G-Sync monitor sold is one guy that won’t be so free to choose anything but Nvidia in the future.

      • DoomGuy64
      • 3 years ago

      “FreeSync” is not a thing in the way Gsync is; it’s AMD’s branding of an open variable refresh standard. Any “freesync”-branded monitor should work with any video output device that *supports* variable refresh rate over *that connection standard*. HDMI 2.1 VRR is most certainly in the same category as freesync, but afaik it must also be supported in the hardware of AMD cards to work. In other words, your AMD card must support HDMI 2.1 to use freesync on that monitor. Cards that don’t support HDMI 2.1 won’t work.

      Either way, this should theoretically be the end of Gsync, since the market is clearly going the “freesync” route. Nvidia could still refuse to support it, because that would lock high-performance GPUs into Gsync. Of course, that will only work up until AMD has a competitive card; then there will be a huge shift away from Nvidia, since the market has already decided to go the other way.

      The only thing that could hurt AMD is not supporting HDMI 2.1, but by the time these monitors are released I think support will be there.

    • Airmantharp
    • 3 years ago

    It’s about time- now we just need full-featured HDTVs to ship with it.

      • southrncomfortjm
      • 3 years ago

      As a couch-and-controller based PC gamer (don’t judge me!) I couldn’t agree more. If this came to an OLED TV, I don’t think I could stop myself from throwing gobs of money at it.

        • Airmantharp
        • 3 years ago

        That’s the schtick: finding an HDTV with HDMI Freesync and a full HDR featureset- *and* support for 120Hz.

        The problem with OLED is that the technology has trouble getting bright enough for good HDR (10,000 nits appears to be the top end, but 1,000 is the minimum and there are various standards between).

        Samsung’s latest LCD respin is likely the best we’re going to get in that regard.

          • NoOne ButMe
          • 3 years ago

          1,000 nits is the target for TVs, 4,000 the mastering target, and 10,000 the max target.
          If I read the different specs right.

          • Laykun
          • 3 years ago

          No TV will even come near the 10,000-nit max. You’re looking at around 1,500 nits at best.

          • southrncomfortjm
          • 3 years ago

          VRR and infinite contrast ratios are more important than full HDR support for me, so OLED would win even without top notch HDR.

          Maybe I’m lucky in this sense – I haven’t seen HDR in action, so I can easily dismiss it.
