Intel plans to support VESA Adaptive-Sync displays

IDF — In a Q&A session this afternoon, I asked Intel Fellow and Chief Graphics Software Architect David Blythe about Intel's position on supporting the VESA Adaptive-Sync standard for variable refresh displays. (This is the standard perhaps better known as AMD's FreeSync.)

Blythe indicated that Intel is positively inclined toward standards-based solutions like Adaptive-Sync, and he said Intel does indeed plan to support this optional extension to the DisplayPort spec. However, Blythe wasn't yet willing to commit to a timetable for Intel enabling Adaptive-Sync support in its products.

The question of a timetable is complicated by whether Intel's GPU hardware will require an update in order to enable Adaptive-Sync capability. A source familiar with the matter has indicated to us that this feature is not present in current hardware, so in all likelihood, Adaptive-Sync support will have to wait until at least after the Skylake generation of products.

Supporting Adaptive-Sync would be a natural next step for Intel, whose integrated graphics processors stand to benefit tremendously from displays with a more forgiving and flexible refresh cycle. Intel's backing would also be a big boost for the Adaptive-Sync standard, since the firm ships by far the largest proportion of PC graphics solutions.

Comments closed
    • tygrus
    • 4 years ago

    The old fixed monitor refresh rate (“Hz”) meant that each refresh took just under 1/rate seconds, eg. 60Hz = approx 16.67ms. An adaptive-sync method means that frames are produced/updated based on the content & processing. Even if the source is 24fps, once the frame is created internally it may only require around 5ms to send the update to the display over a fast display link. The higher the resolution, the more bits per pixel, and the greater the percentage of the screen changed, the longer it takes to transfer the new frame from the internal buffer to the external display. The old method still takes just under 16.67ms to transfer an updated frame to a 60Hz LCD.
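
    A rough back-of-the-envelope sketch of that arithmetic, in Python. The helper function, link rate, coding efficiency, and resolutions below are illustrative assumptions, not figures from the article or the comment:

```python
# Back-of-the-envelope sketch of the arithmetic above: how long one frame takes
# to cross the display link, ignoring blanking and protocol overhead.
# The link rate and coding efficiency below are illustrative assumptions.

def frame_transfer_ms(width, height, bits_per_pixel, link_gbps, coding_efficiency=0.8):
    """Milliseconds to push one full frame over a link of `link_gbps` raw Gbit/s.

    coding_efficiency accounts for line coding (e.g. roughly 0.8 for 8b/10b).
    """
    frame_bits = width * height * bits_per_pixel
    effective_bits_per_second = link_gbps * 1e9 * coding_efficiency
    return frame_bits / effective_bits_per_second * 1000

# A fixed 60Hz refresh shows a new frame only every ~16.67ms.
print(round(1000 / 60, 2))                                # 16.67

# Something like DP 1.2 HBR2 x4 (~21.6 Gbit/s raw): a 1080p, 24bpp frame
# moves in a few milliseconds.
print(round(frame_transfer_ms(1920, 1080, 24, 21.6), 2))  # ~2.88

# Higher resolution and bit depth stretch the transfer, as the comment says.
print(round(frame_transfer_ms(3840, 2160, 30, 21.6), 2))  # ~14.4
```

    Real links add blanking and protocol overhead on top of these numbers, but the shape of the argument holds: transfer time scales with pixels and bit depth rather than with a fixed 60Hz tick.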

    • Krogoth
    • 4 years ago

    Sad that the Nvidia defense force is out in force.

      • JustAnEngineer
      • 4 years ago

      Par for the course.

    • USAFTW
    • 4 years ago

    Oh, thank God! Better yet, with Intel’s backing, many more OEMs and monitor makers will hopefully jump on the variable refresh rate bandwagon. Then we can see really widespread adoption of something flat-panel monitors should have supported in the first place!

    • wingless
    • 4 years ago

    As the owner of a G-Sync display, and as of this upcoming weekend, a 2nd G-Sync display (from a friend), I’m EXTREMELY excited for the rest of you to experience this level of tech. The market’s fast adoption of FreeSync is fantastic news for us all. Hopefully Nvidia will wise up and support it as well.

    Congratulations to you all for finally getting availability of this tech.

    P.S.: Get a 120hz display. It’s [computer] life changing.

      • anotherengineer
      • 4 years ago

      “P.S.: Get a 120hz display. It’s [computer] life changing.”

      I did about 6 years ago. And yes for CS:Source it was way better than the 19″ viewsonic VP930b I had. Especially since I could drive it at 120+ fps all the time also.

    • southrncomfortjm
    • 4 years ago

    I know Intel GPUs are the most common according to the Steam Survey, but will most people who rely on that really care enough to get a Freesync display to use the tech?

    I mean, even as an Nvidia GPU owner I hope so, since it will help drive the open format.

    EDIT – I guess the real play here is to get Freesync in every non-G-Sync display such that it becomes ubiquitous. If so, awesome.

      • jihadjoe
      • 4 years ago

      I imagine their actual marketshare is even bigger than what Steam indicates, because Steam is already skewed towards gamers.

        • Milo Burke
        • 4 years ago

        True, but that’s not really relevant. I don’t need adaptive sync for spreadsheets at work. Unless the goal is to globally reduce power usage.

      • poisonsnak
      • 4 years ago

      nVidia has 52%, Intel has 20%, and AMD has 27% according to the survey.

      Maybe you were thinking of the most popular single GPU, which is the Intel HD 4000 at 6.32%.

      I agree with you that it isn’t going to matter much what Intel does here. Who is going to spring for a high end display with Freesync if all they have to drive it is onboard graphics? Maybe if one day every monitor out there has Freesync…

        • southrncomfortjm
        • 4 years ago

        Maybe – good catch. I may have mixed the survey up with my thought that anyone who doesn’t have a discrete GPU but has an Intel CPU and plays any number of casual games would push Intel’s numbers higher.

        • VincentHanna
        • 4 years ago

        Quite the pickle.
        If you have a good, dedicated GPU that gives you good FPS, you don’t need adaptive sync. Little or no benefit beyond 100fps.

        If you have a cheap, 100 dollar monitor, and *maybe* a $250-$300 CPU package, you are actually likely to see substantial benefits from getting freesync… But of course, if you were willing to pay for it, you probably would already have a dedicated GPU. Adaptive sync on/off makes a big difference. Going from 18fps to 40fps makes a bigger one.

      • UberGerbil
      • 4 years ago

      We may see some laptop displays adopt it. At typical laptop resolutions, a Skylake with eDRAM could be a reasonable gaming machine for many titles (most titles, actually, if you’re willing to live with some of the detail/complexity sliders turned down). Add adaptive sync and such a laptop might actually be a [i<]desirable[/i<] gaming platform, particularly since it wouldn't (or at least shouldn't) cost anything near what dedicated "gaming" laptops have in the past.

        • southrncomfortjm
        • 4 years ago

        I’d love to see it. I’d love a laptop that is capable of delivering playable frame rates and some decent detail on a game like XCOM 2 without having to shell out $1000 for a system with a 950 or 960M.

        Ideally, I’d like a Yoga-style laptop in the 13.3-14 inch range with some mild gaming chops for turn-based games that costs less than $700.

        • VincentHanna
        • 4 years ago

        [quote<]We may see some laptop displays adopt it.[/quote<] Laptop displays already have it.

    • Chrispy_
    • 4 years ago

    Huzzah!
    Finally the nail in the coffin that destroys G-Sync and forces Nvidia to use an open standard.

    Before people say that Intel graphics don’t matter, remember that Nvidia’s entire laptop lineup is utterly and inseparably mated to Intel Graphics via Optimus.

    Another victory for open standards over Nvidia’s proprietary forced market segmentation.

      • Ninjitsu
      • 4 years ago

      Their laptop G-sync is pretty much Adaptive Sync too.

      • Aegaeon
      • 4 years ago

      Right now G-Sync and Optimus are mutually exclusive.

      Mobile G-Sync requires a direct eDP connection from the discrete GPU to the panel, so no routing through the IGP like with Optimus.

      There is also the extra Nvidia license fee required to qualify the panel for G-Sync and enable it in the BIOS before the driver will activate it.

    • steelcity_ballin
    • 4 years ago

    I CAN’T WAIT TO SEE MICROSOFT EXCEL WITHOUT SCREEN TEARS. AT LONG LAST MY INTEGRATED-GRAPHICS DESKTOP APPS NIGHTMARE IS OVER.

    … but seriously, what am I missing? What advantages would really be noticeable running an integrated gpu that’s not going to even be capable of rendering the framerates in the first place?

      • jihadjoe
      • 4 years ago

      Adaptive refresh is actually at its best when paired with a GPU that isn’t capable of driving frames at the screen’s refresh rate.

    • Ninjitsu
    • 4 years ago

    Surprising, since I’d assume Intel hardware already supports variable refresh rates over eDP, which was originally supposed to be a power saving measure.

    • CityEater
    • 4 years ago

    This is a silly question, but does it mean that you could run FreeSync on an Nvidia GPU by running the monitor off the motherboard’s onboard DP output?
    Is driver support needed in that respect?
    If it’s a VESA standard, how long before it becomes the base spec in any monitor?

    • YukaKun
    • 4 years ago

    It’s interesting to see Intel taking a “wait and see” approach. The fact that they’re not committing to either suggests they might even have something in the works internally. That “embrace standards” line is just BS, otherwise Thunderbolt would not exist competing with USB3.

    In any case, that implies they’re not fully convinced FreeSync will triumph over G-Sync, or the other way around, or that either of the two will make any significant impact (i.e. increased revenue) in the market. And I think they’re right. Currently Intel doesn’t care about boutique sales, I’d say, so supporting extra features for the sake of it makes sense only as a far-off item on their roadmap.

    Well, that is just my simple interpretation, but I might not be so off the mark.

    Cheers!

      • renz496
      • 4 years ago

      Lol, that’s why at one point I thought that instead of adopting VESA Adaptive-Sync, Intel might come up with their own solution using Thunderbolt.

      • AJSB
      • 4 years ago

      Do you by any chance have a G-Sync monitor? I almost bet you do :”)

      …or else I think you missed the part where it says, and I quote:

      “Blythe indicated that Intel is positively inclined toward standards-based solutions LIKE ADAPTIVE-SYNC, and he said INTEL DOES INDEED PLAN TO SUPPORT THIS optional extension to the DISPLAYPORT spec.”

      There is NO “wait and see” attitude AT ALL, it’s simply that the iGPU hardware in current Intel CPUs can’t handle Adaptive-Sync.

      …now you can put back your NVIDIA Sunglasses again :))

        • YukaKun
        • 4 years ago

        It would be interesting to know how you can interpret or infer that I am “pro nVidia” from what I wrote.

        In any case, I’ll give you that he is not a PR fella saying stuff on Twitter and his words have a lot more weight, but I bet he can’t make that call by himself either. Until he says “we are working on that at the moment”, you can’t be sure they will actually do it. You have to consider that the spec has been available for a while now, but they are still “thinking” about putting it into their hardware.

        Cheers!

    • Firestarter
    • 4 years ago

    I want to know how Nvidia reacts to this because buying into a technology that could very well become obsolete in the near future is something I would like to avoid

      • renz496
      • 4 years ago

      Add more features to G-Sync? They recently brought a new improvement to G-Sync so that games in windowed mode are able to use it.

    • Krogoth
    • 4 years ago

    If there is any vendor who can standardize adaptive sync, it is Intel.

    They have the marketshare to pull it off and give monitor vendors an incentive to throw in adaptive sync as an extra feature across their monitor line-ups instead of it being “niche/premium” only.

      • HisDivineOrder
      • 4 years ago

      Looks like it won’t be anytime soon, though, unless Intel can work some magic and make it so Skylake supports it.

    • AJSB
    • 4 years ago

    VHS vs Betamax Deja Vu War.

    I can be considered an AMD fanboy, but I concede that there was a review where it seems players favoured G-Sync… it doesn’t matter though: FreeSync, AKA Adaptive-Sync, will win, resistance is futile >:>

    This is just like the VHS vs Betamax war… we all knew that Betamax had superior image quality, etc., but VHS was supported by the movie industry with a flood of movie titles; Betamax wasn’t, and was destroyed in no time.

    Now with the backing from Intel, as soon as they start to ship a FreeSync-compatible iGPU in every one of their CPUs, we can expect monitor OEMs to flock to FreeSync… and we can expect the lower bound of those monitors’ ranges to drop into the 30s or even 20s FPS.

    The writing is on the wall, G-Sync days are numbered.

    • anotherengineer
    • 4 years ago

    That’s all well and good, but when is the question?? I remember them making the USB 3.0 standard and then being one of the last to adopt it. Even AMD had USB 3.0 support in their APU chipsets before Intel did, sheeesh.

    Skylake and Broadwell not supporting DP 1.2a or 1.3 is a fail.

    • Lans
    • 4 years ago

    YES!

    • Bensam123
    • 4 years ago

    And that’s the end of G-Sync.

      • Airmantharp
      • 4 years ago

      Not so long as it remains (tentatively) the superior solution.

        • Bensam123
        • 4 years ago

        Intel is inside everything, it’s even in your mom!

        Just by association machines will support freesync and monitors will start supporting it because it’s another badge to slap on things. Branding and names mean a lot.

          • travbrad
          • 4 years ago

          Intel is inside almost everything, but none of their stuff currently supports adaptive sync, so it’ll probably be a while before adaptive sync displays are commonplace. The OEMs have a part to play too, and they aren’t exactly known for adding stuff if they don’t have to. They typically try to cut costs in every way possible.

            • Bensam123
            • 4 years ago

            Why would it matter if it’s now or the future? Unless Intel plans on supporting Gsync it doesn’t change anything, especially considering people usually look to the future when making tech purchases (what will support this in the future?).

            • travbrad
            • 4 years ago

            [quote<]Why would it matter if it's now or the future?[/quote<]

            Because I want to buy an adaptive refresh display some time soon. :p For the next couple years at least you are effectively vendor locked no matter which way you go. If you get a freesync monitor you can't buy Nvidia GPUs. If you get a Gsync monitor you can't buy AMD GPUs.

            In the long run I actually hope the more open standard wins out, but even if it does, that won't happen for YEARS in my opinion. I just can't imagine Nvidia dropping support for Gsync and supporting freesync within the next few years, which means there will be 2 camps (the Nvidia/Gsync camp and the Intel/AMD freesync camp).

        • Mark_GB
        • 4 years ago

        Were you around back when it was Betamax vs VHS? Betamax was clearly the better solution.

        VHS won.

        • TopHatKiller
        • 4 years ago

        There is no ‘superior solution’. You cannot buy ‘g-sync’ or ‘adaptive-no-brand-name’.
        You buy an actual monitor. And both these solutions depend on the implementation and quality of the actual monitor that you use. Why don’t people realise this? Cheers, anyway.

    • UnfriendlyFire
    • 4 years ago

    Well this is going to be an interesting format war.

    AMD and Intel vs Nvidia over the adaptive refresh standards. Someone is going to lose, badly.

    (Nvidia could choose to support both G-Sync and VESA, but that’s duplicated hardware/software, and monitor manufacturers are unlikely to support both standards without “assistance” from Nvidia.)

      • TopHatKiller
      • 4 years ago

      Hardly. The only possible outcome is nv admitting defeat, again, as they have done on PhysX.
      “Hung-less”, I suppose could run off into a corner of the playing field and cry for his mummy, but it wouldn’t really help.

        • HisDivineOrder
        • 4 years ago

        They’ve admitted defeat with PhysX? When was that precisely? In my experience, it’s still just as “Oh, there it is again!” as usual. It hits a few games a year.

        Hell, I’d argue that their emphasis for PhysX turned into an emphasis for Gameworks and Gameworks seems to be EVERYWHERE this year. Remember two years ago when AMD had Gaming Evolved getting every awesome game? And nVidia was off playing with Tegra Toys? Then AMD focused on Mantle and let nVidia stealthily upgrade their DX11 driver and…

        …fast forward to today. The benchmarks we always needed now show definitively that nVidia multithreaded and boosted their DX11 driver while AMD let theirs decay to further emphasize their Mantle boondoggle.

        A perceived advantage gave nVidia more marketshare. A perceived increase in marketshare suddenly made it seem alright to publishers and developers to take nVidia’s (marketing and support) money because there was a huge enough bloc of users that would benefit from Gameworks (supposedly).

        And just like that, AMD’s locked out of high end performance mostly due to their obviously neglected driver for DX11. DX11 ain’t going anywhere, btw. It’ll still be the favored API for a LOT of games, especially for the next two years (games developed pre-Windows 10’s launch).

        So if anything, PhysX did not grow and it did not shrink. It’s still used in some select titles as usual. Now, though, it’s bound to the seemingly more prevalent Gameworks…

          • TopHatKiller
          • 4 years ago

          Sorry. Shoulda been more clear. Obviously Nv didn’t ‘admit defeat’; Intel hasn’t admitted defeat on Itanium… ad nauseam

        • maxxcool
        • 4 years ago

        I see 4:1 PhysX vs Havok titles… However, ‘nvidiased’ hardware monitors was a dipstick thing to do in my book. And as someone else pointed out… NV now gets to support Mantle indirectly, so that’s gotta burn the short and curlies…

      • VincentHanna
      • 4 years ago

      Theoretically, Gsync should win handily. Not only is their implementation technically superior, but they own 75% of the market, and the so-called “open standard” doesn’t offer any compensatory benefits like backward compatibility, cross compatibility or reduced cost over the proprietary one.

      Anyone who goes to freesync/g-sync will ultimately be forced to select a camp, unless Nvidia allows their stuff to play with the other guy’s… not many people are willing to bet on AMD/Freesync going 6+ years into the future.

        • BobbinThreadbare
        • 4 years ago

        freesync monitors are much cheaper than G-sync

          • HisDivineOrder
          • 4 years ago

          They are. I wish I could say there’s no reason for that, but so far Gsync is worth more. If you don’t believe me, go read the reviews and see for yourself.

            • rxc6
            • 4 years ago

            Is it 100-200 dollars better? I doubt it.

            • Firestarter
            • 4 years ago

            that 100-200 dollars is better invested in the GPU

            • September
            • 4 years ago

            It should end up less than $100 by next year and it makes *every* GPU better. If you can drop the money on the GPU you can invest in a screen that will last twice as long.

            Edit: Grammar still eludes me.

            • Firestarter
            • 4 years ago

            Every GPU? I thought it was only 1 of the 3 GPU manufacturers that support G-Sync, whereas the cheaper option will soon be supported by 2 out of 3. What is that extra $100 supposed to do again?

        • UnfriendlyFire
        • 4 years ago

        Except AMD has Intel’s backing, and since Intel has a near monopoly on the CPU market and has decided to shove an IGP into every consumer CPU, Nvidia is going to have a major fight.

        Sony won the Blu-Ray vs. HD-DVD war after they installed a Blu-Ray player in every PS3.

        The hardware was so subsidized (by sales of games) that some researchers would buy PS3s to strip them of the blue laser readers.

          • NoOne ButMe
          • 4 years ago

          Buy a PS3 for $600 and get a gaming machine, etc. OR buy a Blu-ray player for $1000. Choices, choices.

            • NeelyCam
            • 4 years ago

            I remember those times..

          • VincentHanna
          • 4 years ago

          In the event that free-sync becomes actually free, and is included in a $120 monitor, yeah, that analogy would work. As it is now, you have to go out and buy a totally niche product, a $600 monitor either way… that’s the point. You have to invest to go grab this tech.

            • ChangWang
            • 4 years ago

            Today, yes. In the future, not so much.

            One thing that you aren’t accounting for is the fact that the open standard tends to become a commodity. All the scaler manufacturers are already supporting DP 1.2a in some of their products. At some point it’s going to make more financial sense to phase out the older products that don’t support 1.2a. The same can be said of the panel manufacturers. When every scaler and panel coming out of a factory supports the standard, even those low-end $120 displays will have some sort of adaptive sync range.

          • HisDivineOrder
          • 4 years ago

          They’ll have a major fight… in what? Two years? Three? Because we’ll be on the Skylake generation this year, next year, and a lot of the year after?

          I’m not even sure AMD’ll be around by then if things keep going the way they’re going.

            • _ppi
            • 4 years ago

            AMD Radeon may be bought out by someone and developed further. I guess Intel wouldn’t mind grabbing it from the bankruptcy heap.

        • GrimDanfango
        • 4 years ago

        75% of almost-nothing is almost-nothing… what does the market share matter at this point?

        I actually get the feeling that monitor manufacturers will simply end up offering multi-format monitors, with g-sync *and* freesync built in… and then cheaper monitors will just come with freesync as standard. So g-sync will simply end up the niche-case, “premium” product, and nVidia by their own design will force themselves to limit their adaptive sync offering *exclusively* to that niche, premium market, as they can’t really contrive a way to support freesync for the budget-end of the market, without instantly destroying their entire own market segment.

        If nVidia keep up their general GPU superiority, that situation could probably endure for an indefinite amount of time, but if there comes a point where nVidia end up temporarily on the back foot, g-sync could easily collapse, especially if the freesync standard has evolved to be essentially as good as g-sync by that point.

        Edit: The other thing that I could imagine happening a couple of years down the line – a 3rd party could create an external displayport adapter (or something) that converts whatever g-sync signalling data to drive a freesync-only monitor. That would probably shake things up in a big way.

          • VincentHanna
          • 4 years ago

          [quote<]75% of almost-nothing is almost-nothing...[/quote<]

          75% of a lot is a lot. 75% of milk is milk. All tautologies are true. What's your point? That niche products don't exist? That AMD/Intel's $500-700 monitors aren't somehow just as niche as Nvidia's?

          [quote<]I actually get the feeling that monitor manufacturers will simply end up offering multi-format monitors, with g-sync *and* freesync built in[/quote<]

          Possibly... But only if it was cost effective to do that. As of right now, that would be approx $300 of wasted parts because the two systems are totally not cross compatible.

          [quote<]and then cheaper monitors will just come with freesync as standard.[/quote<]

          The cheaper monitors might not even adopt DisplayPort as standard, let alone freesync. I have seen no evidence of a massive free-sync tool-up that will dilute the market and become standard outside of the small niche of people willing to pay for it (eg videogamers). It *might* become standard across 144Hz 1440p+ monitors. *Might*.

            • GrimDanfango
            • 4 years ago

            [quote<]What's your point? That niche products don't exist? That AMD/Intel's $500-700 monitors aren't somehow just as niche as Nvidia's?[/quote<]

            My point is that it's pointless to quote percentages when a majority counts for nothing - as you rightly point out, both sides are currently selling niche products in niche quantities. A 75% market share of a tiny market is a share that could flip almost overnight for any number of reasons. Neither side is anywhere close to having this "in the bag".

            My understanding was that freesync (or plainly labeled "adaptive sync") is due to be rolled into the DisplayPort standard at some point soon... at which point AMD support would just "come free" with any new monitor supporting that standard. If a select few of those new monitors also included the added premium components to support g-sync, you'd have multi-format monitors - bought by people exclusively for g-sync support, but not tying them to the green team if they feel like a change. All the rest would only support AMD's adaptive sync tech, as it would just be part of the standard, while nVidia would have to actively avoid supporting that standard, as to do otherwise would instantly and entirely cannibalize their own premium segment.

            Ultimately, nVidia seem to be ahead by a nose at this point, with a more robust technology and more products in people's hands... but they've also proprietary-teched themselves into a corner should things not go *precisely* to plan over the next couple of years.

        • BestJinjo
        • 4 years ago

        G-Sync is not technically superior. It depends on how the gamer is playing the game.

        Linus already investigated this objectively:
        [url<]https://www.youtube.com/watch?v=MzHxhjcE0eQ[/url<]

        Since FreeSync costs less, is an open industry standard, and has a range of 9-240Hz, it's actually technically superior to G-Sync overall: more gamers can take advantage of the technology because the ultimate price of a new monitor with Adaptive-Sync support is cheaper. If monitors supported both FreeSync and G-Sync, and NV cards did the same, the consumer would never have to think twice about upgrading to a new monitor. But since NV wants to go the proprietary route to lock consumers in and make profits off them, this creates a major divide in the industry which ultimately hurts a lot of PC gamers who aren't loyal to any particular GPU brand.

          • VincentHanna
          • 4 years ago

          G-sync was introduced to combat things like screen tearing and artifacts introduced by Vertical Sync, and to improve midrange performance between 30hz and 75hz. I said that Gsync is technically better because it does that without smearing your picture across the screen and introducing new different artifacting, thereby negating its intended purpose.

          Even if I accept your claim that Linus’s video objectively tested an aspect of adaptive sync, and not other factors related to the rig that have nothing to do with the technology… I’m still not sure I would buy your conclusion that “it depends on how you use it”.

          • AJSB
          • 4 years ago

          Sadly 9-240Hz is only theoretical, at least for now… I would love to see, and would buy, a 20-144Hz FreeSync monitor.

            • dragontamer5788
            • 4 years ago

            [quote<]I would love to see, and would buy, a 20-144Hz FreeSync monitor.[/quote<]

            The best I've seen is 30Hz - 144Hz.

            [url<]http://www.anandtech.com/show/9530/nixeus-nxvue24a-144hz-freesync-monitor-set-to-ship[/url<]

            Honestly though, the 45-75Hz range I got has been doing fine. I'd like to go up to 144Hz, but even an R9 290X isn't pushing that out consistently.

            • AJSB
            • 4 years ago

            …and also, to be honest, a 30-144 range would be OK for me (and there are already some 30-144 and 35-144 ones), but I want perfection 😀

        • TrptJim
        • 4 years ago

        If G-Sync were free for anyone to use and did not require a proprietary module inside the monitor itself, sure, it could win handily. It is totally possible to support FreeSync with a firmware update on your monitor or HDTV, if the manufacturer chooses to do so. There are 42″ 4K HDTVs from Korea that now have FreeSync firmware. This is completely impossible with G-Sync. There’s no barrier to entry when it comes to FreeSync, which is why you’re seeing so many pop up at prices hundreds of dollars cheaper than an equivalent G-Sync version.

        • maxxcool
        • 4 years ago

        ummm Intel owns 75% of the GPU market.

      • DPete27
      • 4 years ago

      Everyone (including Nvidia) knows that G-Sync is a dead end. I think Nvidia is putting out better GPUs these days, but I refuse to buy another Nvidia GPU until they get their heads out of their @$$3$ and support VESA adaptive-sync. If they know what’s good for them, they’ll add support with Pascal.

    • Damage
    • 4 years ago

    I’ve updated this story with stronger wording to better indicate what David Blythe told me: Intel definitely intends to support VESA Adaptive-Sync displays.

      • TopHatKiller
      • 4 years ago

      Downvoted that. Uhm. No reason at all to do that, other than an immediate knee-jerk reaction against authority. Of Any Kind. Sorry.

        • Firestarter
        • 4 years ago

        4 days, 1 hour and 26 minutes

        Worst. Knee-jerk. [i<]Ever.[/i<]

          • TopHatKiller
          • 4 years ago

          Occasionally I do other things than read TR posts…. Uhm, occasionally.

        • NeelyCam
        • 4 years ago

        lolwut?

          • TopHatKiller
          • 4 years ago

          I think it’s meant to be surrealistic humour. I wouldn’t really know, though.

      • chuckula
      • 4 years ago

      [quote<]I've updated this story with stronger wording[/quote<] Yes, but the swear-word filters took the stronger wording right out!

    • christos_thski
    • 4 years ago

    When I first heard about the two adaptive sync standards, I was enthused that they might reduce tearing and stutters at low framerates (i.e. around 25-40), but all the reviews and articles I have seen so far are almost exclusively focused on high framerates, above 50 or thereabouts. I’m sure it will make for a nice checkbox and all, but Intel’s integrated GPUs have nowhere near the performance to reach those framerates at mid-high detail, so what’s the point? Are users expected to choose low quality with adaptive sync over medium or high without its benefits, running on Intel’s iGPUs? Or am I missing something? (I’d much rather have an explanation of what I am missing along with the downvote, thankyouverymuch; perhaps my understanding of adaptive sync is not correct 😉 )

      • nanoflower
      • 4 years ago

      There’s nothing about Intel’s solutions that prevents them from achieving high frame rates in many games. They just won’t do it in the triple-A, graphics-heavy games, but in lighter games like Torchlight a user can benefit from high frame rates and adaptive sync if Intel supports it.

      • NoOne ButMe
      • 4 years ago

      I imagine Intel will be pushing for panels in the lower part of the stated Adaptive-Sync range. If memory serves, the Adaptive-Sync standard currently supports a theoretical 9-240Hz.

      I believe Intel will push for 9-60Hz displays, primarily in notebooks. Well, as low as they can make the refresh rate go. I imagine past a certain point (seems to be 25-35Hz) it is very hard to go lower.

        • renz496
        • 4 years ago

        Not sure about Adaptive-Sync; I think AMD’s slides mention that it is their FreeSync solution specifically that can do 9Hz-240Hz. Anyone can create or define their own spec, but in the end it is still bound by the laws of physics. Right now the minimum for any G-Sync or adaptive sync panel is most likely 30Hz. To make it work at much lower frequencies, panel makers need to develop new panels that don’t work the way current panels do.

          • NoOne ButMe
          • 4 years ago

          Given how AMD acts, I would imagine that 9-240Hz is taken right out of the current Adaptive-Sync standard. I imagine the range can shift, but currently it is “only” 9-240Hz.

      • willyolio
      • 4 years ago

      The tech is new and therefore more expensive. Expensive things are generally targeted at the high end to justify the expense. High-end gamers are worried about framerates dipping below 60fps; if they were around 30fps it would be unacceptable.

      Also, justifying a $300 monitor purchase to improve “smoothness”, versus keeping your current monitor and buying a better GPU, is a hard sell.

      • Milo Burke
      • 4 years ago

      Upon initial review, Scott said something along the lines of “40 fps is the new 60”. It absolutely offers benefit below 60 fps. In fact, the lower the frame rates go, the more benefit it offers.

      However, monitors seem to be implementing a floor for how low the frame rate can go before it doesn’t work anymore. If memory serves me, an early G-Sync display had a floor of 30 fps, and an early FreeSync had a floor of 40 fps. 30 seems useful, 40 seems much less useful for the use cases that could benefit most. I’d love to see 20 fps or below as the floor, but I’m sure there are technical hurdles.

      If Intel supports it and/or laptop makers do implement something as low as a 9 fps floor for adaptive sync (per NoOne ButMe’s comment), I think we’ll see some growth in that direction in the typical display for non-loaded gamers.
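
      A minimal sketch, in Python, of one way a driver could in principle cope with a panel whose variable-refresh floor sits above the game’s frame rate: scan out each frame several times so the effective refresh rate lands back inside the supported window. This is an illustration of the idea only, not any vendor’s actual algorithm, and the ranges in the examples are made up.

```python
# Illustrative sketch only (not any vendor's actual algorithm): if the game's
# frame rate drops below the panel's variable-refresh floor, scan out each
# frame several times so the effective refresh rate lands back in range.

def effective_refresh(frame_rate_fps, vrr_min_hz, vrr_max_hz):
    """Return (multiplier, refresh_hz): how many times to repeat each frame so
    the panel runs inside [vrr_min_hz, vrr_max_hz], or None if the range is
    too narrow to bridge by repeating frames."""
    if frame_rate_fps >= vrr_min_hz:
        return 1, min(frame_rate_fps, vrr_max_hz)
    multiplier = 2
    while frame_rate_fps * multiplier < vrr_min_hz:
        multiplier += 1
    refresh_hz = frame_rate_fps * multiplier
    if refresh_hz > vrr_max_hz:
        return None
    return multiplier, refresh_hz

print(effective_refresh(25, 40, 144))  # (2, 50): each frame scanned out twice
print(effective_refresh(9, 40, 144))   # (5, 45)
print(effective_refresh(20, 48, 60))   # (3, 60)
```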

        • BestJinjo
        • 4 years ago

        The problem isn’t the Adaptive-Sync tech but the monitors themselves since monitor manufacturers have to spend more/invest more $ into higher end panels that offer a wider range of refresh rate support. We are already starting to see 30-144Hz FreeSync monitors:
        [url<]http://slickdeals.net/f/8048225-24-nixeus-vue-nx-vue24a-1080p-144hz-freesync-gaming-monitor-pre-order-297-free-shipping[/url<]

        The FreeSync / Adaptive-Sync technology technically has a much wider range than G-Sync, spanning 9-240Hz.

          • renz496
          • 4 years ago

          I think Nvidia has no problem supporting a wider range. They go with what’s possible within current tech limitations. Is there any panel faster than 144Hz? As for the minimum, the limit seems to be 30Hz until panel makers create a new type of display that can go lower. AMD specifically mentions that range so they have a talking point for why FreeSync is better than G-Sync, while in reality those numbers are impossible with current panel tech.

        • ChangWang
        • 4 years ago

        With FreeSync, AMD has stated that the technology itself can go as low as 9Hz (or 9fps). The range you actually get is dependent on the scaler/panel combination that the monitor vendor uses, which is why the sync ranges vary from display to display.

    • BobbinThreadbare
    • 4 years ago

    In current FreeSync or G-Sync implementations, does it lower the refresh rate to match video sources? For example, running at 48Hz to match 24fps movies?

    Is it possible to do this?
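
    A sketch of the arithmetic behind that idea (not how either driver actually negotiates refresh rates): pick a supported refresh rate that is a whole multiple of the video’s frame rate, so every film frame is shown an equal number of times. The helper function and the refresh-rate ranges in the examples are made up for illustration.

```python
# Sketch of the arithmetic in the question above: choose a refresh rate within
# a variable-refresh monitor's range that is a whole multiple of the video's
# frame rate, so each film frame is displayed an equal number of times.
# The ranges in the examples are made up for illustration.

def matching_refresh(content_fps, vrr_min_hz, vrr_max_hz):
    """Lowest supported refresh rate that is an integer multiple of
    content_fps, or None if no such multiple fits in the range."""
    multiple = 1
    while content_fps * multiple <= vrr_max_hz:
        rate = content_fps * multiple
        if rate >= vrr_min_hz:
            return rate
        multiple += 1
    return None

print(matching_refresh(24, 40, 144))     # 48: two scans per film frame
print(matching_refresh(23.976, 30, 75))  # 47.952
print(matching_refresh(24, 48, 75))      # 48
```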

      • Zoomastigophora
      • 4 years ago

      I’m guessing if you use madVR with fullscreen exclusive mode, this will work automatically.

        • brucethemoose
        • 4 years ago

        AFAIK MadVR uses the display’s native refresh rate, and just blends frames together. SVP does the same thing with a lot of extra processing.

        There IS some software that automatically changes the display’s refresh rate to match the source, even without freesync… But I can’t remember what it was called, and my Google-fu is failing me :/

        Also, Kodi seems to change the display’s refresh rate on platforms other than Windows. But I haven’t tried it in Windows myself.

          • meerkt
          • 4 years ago

          I think changing the refresh rate for fullscreen is quite a common feature. It’s in MPC-HC, for example. You can manually configure different refresh rates for different ranges of frame rates.

          BTW, the static frame rate of movies also solves the problem of dynamic pixel overdrive compensation.

      • Andrew Lauritzen
      • 4 years ago

      This actually came before Gsync/freesync and was supported out of the box in Win8 and associated hardware (at least on many laptops w/ eDP, although probably not desktops).

        • chuckula
        • 4 years ago

        IT’S YOU! YOU SURVIVED IDF!!

        Hey, having read the slides on the Gen 9 GPU architecture, I really have to give you guys mad props. The CPU side might be Krogothed, but the GPU side is doing very well indeed.

          • Andrew Lauritzen
          • 4 years ago

          Not yet, one more day!

          Glad you liked the slides so far for Gen9 – just wait for my presentation tomorrow for some more 🙂

            • Ninjitsu
            • 4 years ago

            Hey, you guys should really consider the dGPU market at this point. Gen9 seems similar to Maxwell in raw compute at least (and much higher FP64).

            • USAFTW
            • 4 years ago

            This. I know the chances of this happening are really slim, but a big Intel dGPU with more feature-packed drivers on 14nm or 22nm would be killer. Nvidia needs to reconsider its pricing on its Gx-x04 and Gx-x00/x10 chips.

            • Firestarter
            • 4 years ago

            I want an AMD or Nvidia GPU on Intel 14nm, can you imagine?

            • renz496
            • 4 years ago

            They just need to price it like Fermi: $200-250 for the Gx-x04 part. But doing so would probably hit AMD much harder than it is being hit right now. If Nvidia pushes further with their pricing then we might see this lol:

            Intel: low end to mid range
            Nvidia: upper mid range to ultra high end
            AMD: buy console

            XD

            • Ninjitsu
            • 4 years ago

            If they made a 500mm2 part and priced it like Fermi, Nvidia would be forced to slash prices on everything from GeForce to Tesla. Intel Gen9 has far more FP64 than Maxwell, and FP32 is on par as well. Efficiency-wise Intel is as focused as Nvidia, and they have a foundry advantage. So the competitive scenario this would create would be awesome for us.

            Nvidia of course have the driver and developer base, and Intel will have to push the image of a GPU brand.

            But you’re right, it’ll decimate AMD (which is why Intel should buy their graphics division too lol). They could actually solve the branding issue by using the ATI name.

          • Andrew Lauritzen
          • 4 years ago

          Here’s some more slides from our presentation this morning. Audio will likely go up in a ~week or so. Enjoy!

          [url<]https://hubb.blob.core.windows.net/e5888822-986f-45f5-b1d7-08f96e618a7b-published/54f4f27e-62d8-4b7b-8364-fa8f110b1664/GVCS004%20-%20SF15_GVCS004_100f.pdf?sv=2014-02-14&sr=c&sig=Cv7l%2FgyeEHCeyeBY%2B26YNU%2Bbhh2HgcazoGBTkobMU10%3D&se=2015-08-21T18%3A15%3A08Z&sp=rwd[/url<]

            • chuckula
            • 4 years ago

            w00t!

            [Edit: OK, any deck of slides that uses the term “Sparse voxel octrees” is a winner for me, even if I have no freakin’ clue what they are!]

      • brucethemoose
      • 4 years ago

      It is possible (looking for the software atm, I used to use it but can’t remember the name), but it has nothing to do with freesync or gsync.

      However, using SVP + Reclock to generate extra frames looks better to me. You can tone down the “interpolation” effect if it bothers you, while still blending frames together for smoothness.

      • webs0r
      • 4 years ago

      Not really. The feasibility is in question, due to the lack of an accurate clock source to drive and exactly time the software-initiated push of each frame. With games, by contrast, timing doesn’t matter; the software just sends out each frame whenever it is ready.

      I think there was a view that this could be possible but would require driver support, assuming that the video card itself has an accurate custom clock source.

    • NoOne ButMe
    • 4 years ago

    yay! Also, remember this will increase battery life! =]

      • ImSpartacus
      • 4 years ago

      If that isn’t convincing for Intel, I don’t know what would be.

      • torquer
      • 4 years ago

      Remember that mobile display technology differs significantly from the desktop.

    • xeridea
    • 4 years ago

    And then Nvidia can quit its childish methods of not supporting open standards to try to vendor lock people into their own stuff. They said they would not support Mantle or Freesync, but Mantle is now Vulkan, and they will be forced to support Freesync when Gsync gets outvoted 5 to 1 by chip makers.

      • lycium
      • 4 years ago

      Hear, hear!

      • bthylafh
      • 4 years ago

      Let’s not forget that Gsync was out first by a big margin. I can’t condemn them for backing their horse given that and all the resources it must have taken to get it off the ground.

        • derFunkenstein
        • 4 years ago

        Agreed, but thanks to Adaptive-Sync displays, it’s time to put that horse out to pasture.

          • nanoflower
          • 4 years ago

          I wouldn’t go that far. At least not if you are suggesting Nvidia drop support for G-Sync. Given the comments Nvidia has made suggesting G-Sync has real advantages over Adaptive Sync, I see no problem with Nvidia continuing to support and develop G-Sync. So long as they also support Adaptive Sync on the same hardware. That way Nvidia users get the advantages whether they have a monitor supporting G-Sync or Adaptive Sync.

            • derFunkenstein
            • 4 years ago

            Not drop support (because that would suck for their early adopters), but at least they need to add support for Adaptive-Sync displays.

            • Airmantharp
            • 4 years ago

            Agreed. G-Sync still appears for now to be the superior technology, and I’d prefer that they continue to develop (and of course support) it, but they should also support Adaptive Sync displays as well.

            • Andrew Lauritzen
            • 4 years ago

            Yeah they need to do it both ways for me to be willing to take the plunge: i.e. NVIDIA GPUs support adaptive sync displays *and* Gsync module/monitors support freesync.

            • derFunkenstein
            • 4 years ago

            If G-Sync is really better than Adaptive-Sync, then you’re probably right; building something better and charging for it is fine. It’s not supporting the open standard alongside that bugs me. I’m personally not likely to buy a G-Sync monitor, but down the road given the modest price premium I could see buying an Adaptive-Sync. If whatever video card I’m using at the time supports it, that is.

            • DPete27
            • 4 years ago

            Of course Nvidia would say their solution is better…

          • the
          • 4 years ago

          Indeed. AMD has quietly dropped Mantle in favor of the Vulkan standard even though they started the trend toward low-level APIs on the PC. Similarly, nVidia should acknowledge that the VESA standard is, well, [i<]the standard[/i<] now and support it. If they feel that the advantages of G-Sync displays are still worth supporting over VESA adaptive sync, then by all means continue, but don't play the proprietary game. Let Geforce hardware work with adaptive sync displays and let G-Sync displays work in an adaptive sync mode.

            • Ninjitsu
            • 4 years ago

            AMD didn’t “start the trend”. By all indications they simply were the first to start yelling about it. They dropped Mantle because they didn’t really have the resources to support it long term, and DX12 would make the effort pointless.

        • ChangWang
        • 4 years ago

        As a consumer, their investments and resources mean little to me if I’m paying more for a proprietary product that’s marginally “better” than the standards backed solution.

          • Vaughn
          • 4 years ago

          A bit off topic, sorry.

          Reminds me of the current issues with taxis and Uber.

          The taxi companies complain about their infrastructure and how Uber is cheaper because they don’t have to deal with all the extras.

          When I’m riding in your cab I don’t really give a damn about government regulation and all your fees and your cost of doing business; I just care whether you can get me from A to B at a good price.

          So why do I care what NV’s spent on making Gsync etc.

            • Nevermind
            • 4 years ago

            [url<]https://en.wikipedia.org/wiki/Disruptive_technology[/url<]

          • NoOne ButMe
          • 4 years ago

          Very well put.

          • ludi
          • 4 years ago

          So would you buy the non-proprietary, standards-backed solution if it means not buying Nvidia?

          • Ninjitsu
          • 4 years ago

          I really hope you don’t pay for Apple products, then?

            • ChangWang
            • 4 years ago

            I don’t. I’m not a fan of their stuff for the very same reasons.

        • TopHatKiller
        • 4 years ago

        Nonsense. “G-sync” was piggy-backed on the embedded DP 1.2 standard. Nv engineers realised how they could turn that standard into a frame-pacing instrument, then, one imagines, “hung-less” and crew thought of ways to make it proprietary.
        The only thing nv was first in was spotting the advantages of the embedded tech. Good for them on that, though – AMD didn’t.*
        *That actually is why amd were able to respond to g-sync so quickly – and then dither and fail to introduce their own free version for so long – amd quickly spotted where g-sync had come from, but then took a ridiculous amount of time signing up vesa and partners.

        • xeridea
        • 4 years ago

        The issue is that they wanted to pour money into a proprietary chip that would cost consumers more in the end, rather than take the much simpler approach with Adaptive-Sync. They chose to try to monopolize the idea by throwing money at it, so I don’t feel bad for them.

        • willmore
        • 4 years ago

        Let’s see, VESA working groups discuss variable refresh. AMD stays and works with the group and eventually it ends up in a released standard and AMD supports it.

        nVidia takes the early working group efforts and starts a skunkworks project to implement it their way before VESA can release it. But it ends up being incompatible with the standard.

        Which company did the best thing for the consumer?

      • odizzido
      • 4 years ago

      Don’t underestimate Nvidia’s potential to be dinks.

      • Duct Tape Dude
      • 4 years ago

      [b<]2017[/b<]: Nvidia invents a new proprietary cable that goes straight from PCIe to a new proprietary port on a proprietary monitor, which has not only a proprietary GSync chip, but a full GTX graphics card which supports DX12 by translating calls into DX11.5. They dub it [b<]DisplayWorks[/b<].

        • chuckula
        • 4 years ago

        You forgot the required proprietary eye implants that depolarize the proprietary photons from the monitor!

        • travbrad
        • 4 years ago

        Are they going to change the name of their company to Apple too?

          • USAFTW
          • 4 years ago

          Lime would work better.

      • MFergus
      • 4 years ago

      To be fair they never had a chance to support Mantle anyways since it was never open but of course that all changed when it was given to Khronos and became Vulkan.

        • renz496
        • 4 years ago

        AFAIK when AMD came up with Mantle, Nvidia did not say anything about it. It was quite some time before they publicly mentioned they had no intention of supporting Mantle. But AMD was not very clear about opening Mantle in the first place. Intel had been asking for the spec from the very beginning and AMD kept giving the ‘beta’ excuse, while in the meantime Intel somehow managed to make working drivers for DX12. Vulkan, while a step in the right direction for an open 3D API, may not change anything on Windows-based machines. On mobile, Apple has their own Metal API. For Android, even if Google supports Vulkan, adoption among game devs might not be that strong for a new API.

      • TopHatKiller
      • 4 years ago

      Finally. Good sense.

      • HisDivineOrder
      • 4 years ago

      That will happen the very day Intel announces actual support for shipping products. In the meantime, it’s AMD’s marketshare versus nVidia’s and, well, that’s not looking so hot for AMD atm.

      It doesn’t hurt that Gsync is actually more capable and monitor makers love anything that gives them an excuse to charge a premium for what they want to call a premium product. If nVidia charges a little more for that board, the monitor maker’ll ramp up that cost not just to compensate, but also to take advantage of the premium nature of the product.

      And if nVidia has proven anything with the Great Titan experiment, it’s that they can pull off charging more for something just a LITTLE more capable and it will sell to a select group of high end users. Especially when their little halo cards prove more than able at convincing users buying $650+ cards that they’re actually getting a deal versus those Titans…

      That guy who just paid $650-750 for a Geforce 980 Ti is going to be bargain shopping for monitors? Nah. I don’t think so. And do most monitor makers want to be the company selling the new high end monitor or the cheapest of the cheap?

      Most seem to want to be selling the big dog. And the big dog goes to…? The same kinda guy who just bought that $650 980 Ti because he convinced his wife that it’s a steal over the $1k+ Titan it performs a little better than in many overclocked scenarios.

      • plonk420
      • 4 years ago

      While almost all of my flagship GPUs have been AMDs*, I must point out that Nvidia is outselling AMD 5 to 1, so I don’t think there is pressure in FreeSync’s direction. Yet.

      *my Ti4200 and 6800GT were nearly-flagships

        • xeridea
        • 4 years ago

        Not quite 5 to 1 for discrete in Q2 (and that doesn’t count APUs). Fury is selling like hotcakes and the Nano is yet to be released. Intel holds the majority of the GPU market by volume; though many people don’t game much on integrated graphics, it is still a large number. There are other uses for variable refresh tech, such as video playback, HTPC, etc. Point being, Intel supporting Adaptive-Sync can’t be overlooked by Nvidia.

      • Deanjo
      • 4 years ago

      And yet you are not whining about MS’s proprietary direct X…

        • chuckula
        • 4 years ago

        [quote<]And then AMD can quit its childish methods of not supporting open standards to try to vendor lock people into their own stuff. They said they would not ever publicize Mantle or support G-sync, but they've been forced to open Mantle as now Vulkan, and they will be forced to support Gsync gets outvoted 5 to 1 by chip makers.[/quote<]

          • September
          • 4 years ago

          lol, nice montage

      • torquer
      • 4 years ago

      You could stop childishly blaming Nvidia for not supporting open standards and instead look at how these standards often FOLLOWED Nvidia’s proprietary tech. Adaptive Sync did not exist prior to G-Sync. It’s hardly fair to get mad at someone for not supporting a standard that doesn’t exist yet.

      Similarly, where is your blame when AMD does the same thing? Mantle was not open.

      Standards are always better, so Nvidia should now support those standards that have now been developed, but you’d have more credibility if you were more objective in your statements and less emotional.

        • DoomGuy64
        • 4 years ago

        It’s been mentioned before, but Nvidia stole the idea for adaptive sync from VESA to rush it out before it became a standard. Nvidia could have helped finish the standard instead of making their own, and this fragmentation set adaptive sync adoption back a good ways. Also, Mantle was nothing more than a tech demo to prove what a close to the metal api could do for performance, and it pushed MS to fast track dx12, and then became Vulkan which is open. AMD never intended to make Mantle into Glide. It was an api designed to showcase the potential of GCN.

        Mantle was never meant to become crippleware like Gameworks. It served its purpose, and AMD has moved on. Gameworks is still around and ruining every game Nvidia gets its grubby mitts on, like Batman:AK.

          • torquer
          • 4 years ago

          Can you prove any of that?
