FreeSync monitors will sample next month, start selling next year

The Siggraph conference is going down in Vancouver, Canada this week, bringing a bunch of graphics goodness into the back yard of TR's northern outpost. Freshly minted AMD "Gaming Scientist" Richard Huddy spoke at the show, and we sat down with him yesterday to discuss a range of topics that included AMD's "FreeSync" alternative to Nvidia's G-Sync adaptive refresh tech.

The first FreeSync monitors will start sampling as early as next month, Huddy told us, and finished products are due to hit the market early next year. That's a little more precise than the release timeframe AMD mentioned in May.

"Multiple" vendors are preparing displays based on the technology, though Huddy declined to name names. Interestingly, he suggested there's more excitement surrounding adaptive refresh mojo than there is for 4K resolutions. You'll certainly need a lot less graphics horsepower exploit the benefits of a dynamic refresh rate than you will to run games at 4K.

FreeSync is based on an embedded DisplayPort capability that was formally added to version 1.2a of the standard spec. Like the rest of the standard—and unlike G-Sync—this "Adaptive-Sync" feature is royalty-free. There are some associated hardware requirements, but the additional cost should be minimal, according to Huddy, who told us he'd be surprised if FreeSync compatibility added more than $10-20 to a display's bill of materials. Even taking additional validation costs into consideration, monitor makers should be able to support adaptive refresh rates fairly cheaply. They're still free to charge whatever premium they want, though.

There are no requirements surrounding the range of refresh rates that monitor makers must support. However, Huddy expects entry-level models to start at 24Hz, a desirable floor because it matches the frame rate of typical film content. Higher-end implementations could scale up to 144Hz and beyond.

Some of AMD's current products use cheaper display controllers that won't be compatible with Adaptive-Sync. (A full list of compatible GPUs and APUs is available here.) However, Huddy said all future AMD hardware will support the feature. The firm is evidently committed to the technology, and it will be interesting to see how the finished products compare to equivalent G-Sync solutions. We will dutifully subject ourselves to hours of gaming "tests" to get to the bottom of that important question.

Comments closed
    • Airmantharp
    • 5 years ago

    I want to see both FreeSync and G-Sync succeed in the market, and I’m willing to bet that a finalized ASIC with any necessary supporting hardware that supports both technologies wouldn’t cost significantly more to produce in volume than an ASIC etc. that supports one or the other exclusively.

    I’m very happy that we have competition among industry giants to address an issue that has hampered the video game experience since its inception!

    • oldog
    • 5 years ago

    Shoot. If I were to appoint a “gaming scientist” at a fictional company that I owned I would name him Mario Nukem.

    Missed marketing opportunity?

    • rgreen83
    • 5 years ago

    What the term “FreeSync” actually is, is a [b<]brand name[/b<] for a spec just as "WiFi" is a brand name for the 802.11 standards or "FireWire" is a brand name for a spec. It would seem pretty silly to be whining that wires were never intended to actually be on fire.

    • Flapdrol
    • 5 years ago

    When next year? January or December?

      • 0x800300AF
      • 5 years ago

      March 14

    • Voldenuit
    • 5 years ago

    [quote<]The AMD Radeon™ R9 295X2, 290X, R9 290, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming.[/quote<] Dang. No love for Tahiti (R9 280, 280X, 7970, 7950). Now I feel sorta bad for recommending the 280 to people as the best upper midrange card for the money.

    • Ninjitsu
    • 5 years ago

    [quote<] Interestingly, he suggested there's more excitement surrounding adaptive refresh mojo than there is for 4K resolutions. [/quote<] As it should be. Combine this with DX12/NG-OGL, and a lower mid-range setup should be able to provide a smooth, tear free gaming experience at 1080p. And when we're finally ready on the content, price and quality side of things for 4K, the same idea will transition over.

    • fredsnotdead
    • 5 years ago

    Any chance this will come to (especially 4k) TVs? Is it even possible with HDMI?

      • mczak
      • 5 years ago

      Pretty sure it isn’t, since with HDMI the pixel clock is directly tied to the HDMI signalling clock.
      (DP, on the other hand, transmits frames as packets, so the timing isn’t tied to the DP clock.)

      • Airmantharp
      • 5 years ago

      Neither FreeSync nor G-Sync is resolution-limited; however, it won’t happen over HDMI as HDMI is currently implemented.

      Further, for pre-rendered content like any form of video, or even console games that lock themselves to specific refresh rates, a variable V-Sync solution is really not needed. It wouldn’t hurt to have it available on a TV for other sources like a Steambox, of course.

        • Jason181
        • 5 years ago

        It would eliminate the need for 3:2 pulldown, and any artifacts that introduces.

          • Airmantharp
          • 5 years ago

          And that raises one of my initial questions about variable V-Sync outside of gaming- couldn’t media players take advantage of the technology as well?

          • JustAnEngineer
          • 5 years ago

          That was already achieved with existing DVI and HDMI interfaces. Blu-rays play back at 24 Hz. That’s why 120Hz televisions are the norm – so that they refresh at an even multiple of 24, 30 and 60 Hz, eliminating nasty telecine judder.
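
          A rough back-of-the-napkin sketch of why the even multiple matters (just my own arithmetic, nothing from a spec):

```python
# 24 fps film on a 60 Hz panel needs 3:2 pulldown: frames are held for
# alternating 3 and 2 refreshes, so on-screen durations wobble between
# 50.0 ms and 33.3 ms -- that wobble is the telecine judder.
for held in (3, 2, 3, 2):
    print(f"held {held} refreshes at 60 Hz = {held * 1000 / 60:.1f} ms")

# On a 120 Hz panel, every film frame is held for exactly 5 refreshes:
print(f"held 5 refreshes at 120 Hz = {5 * 1000 / 120:.1f} ms")  # 41.7 ms, every time
```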

            • Jason181
            • 5 years ago

            It’s still an issue with 60Hz monitors and televisions though, correct?

          • DaveBaumann
          • 5 years ago

          This is part of the DP1.2a Adaptive-Sync spec:

          [url<]http://www.vesa.org/wp-content/uploads/2014/07/VESA-Adaptive-Sync-Whitepaper-v1.pdf[/url<]

            • Airmantharp
            • 5 years ago

            Agreed- but here’s a question for Dave or those in the know: is Adaptive Sync mandatory in any DP spec, or will it be?

    • hoboGeek
    • 5 years ago

    My momma always said: “Life is like a FreeSync monitor refresh rate. You never know what you’re gonna get”

    • BitBlaster
    • 5 years ago

    [quote<]You'll certainly need a lot less graphics horsepower exploit the benefits... [/quote<] You'll certainly need a lot less graphics horsepower [u<]to[/u<] exploit the benefits FTFY

    • DPete27
    • 5 years ago

    A 1440p monitor with adaptive refresh….TAKE MY MONEY NOW!!!

    • willmore
    • 5 years ago

    Nice web site you have there, AMD:
    [quote<]It looks like your browser does not have JavaScript enabled. Please turn on JavaScript and try again.[/quote<] You need JS for a freaking list of GPUs? What's it do, dynamically reformat itself or some other useless crap? FFS.

      • willmore
      • 5 years ago

      Jeez, no you don’t need a cookie, either. What mongoloid wrote your web site, AMD? Did you let the marketing department do it?

      • Shambles
      • 5 years ago

      High five, Noscript is the best anti-malware piece of software out there. If I load up a site and it won’t work without cross site scripting I immediately leave it. Just because you can use JS doesn’t mean you should.

      • mertenz
      • 5 years ago

      Nice web site you have there, NVIDIA:
      You need JS for a freaking list of GPU drivers? What’s it do, dynamically reformat itself or some other useless crap? FFS.

      ———————————————-
      Nice web site you have there, INTEL:
      You need JS for a freaking list of drivers? What’s it do, dynamically reformat itself or some other useless crap? FFS.

      Welcome to the Internetz

    • SCR250
    • 5 years ago

    FreeSync is not FREE.

    [quote<]There are some associated hardware requirements, but the [b<][u<]additional cost[/u<][/b<] should be minimal, according to Huddy, who told us he'd be surprised if FreeSync compatibility added more than $10-20 to a display's bill of materials. Even taking additional validation costs into consideration, monitor makers should be able to support adaptive refresh rates fairly cheaply. [b<][u<]They're still free to charge whatever premium they want, though[/u<][/b<]. [/quote<]

      • someuid
      • 5 years ago

      FreeSync is a spec, not a product. Since it is royalty-free, FreeSync is free. Part makers charging extra for the more capable parts that monitor makers use to enable FreeSync is standard market economics, not a design aspect of FreeSync.

      I’m pretty sure the use of the word free was not in reference to end consumer pricing. It was in reference to freedom of the equipment in the graphics pipeline to pick a more useable refresh rate to enable fluid, smooth video and game play.

        • SCR250
        • 5 years ago

        [quote<]FreeSync is a spec, not a product.[/quote<] How exactly can you play games on a FreeSync spec? Answer: you can't. And if a product that is FreeSync capable costs more than the same product without FreeSync then FreeSync actually costs those who buy it something more than FREE. Hence FreeSync is not FREE.

          • MadManOriginal
          • 5 years ago

          Features aren’t free to end users because they have a price to implement. A spec can be free in that it has no licensing cost. Don’t be pedantic about it.

          • derFunkenstein
          • 5 years ago

          So in your world, USB 3.0 is a product, not a spec.

          *Monitors are products that meet specifications
          *FreeSync is a specification that can be used with monitors
          *Some monitors will implement the FreeSync specification, though they may cost more than other monitors due to component costs

          *Thumb drives are products that meet specifications
          *USB 3.0 is a specification that can be used with thumb drives
          *Some thumb drives will implement USB 3.0 specification, though they may cost more than other thumb drives due to component costs.

          Is it really that hard?

          • Freon
          • 5 years ago

          Free has two meanings in English. Libre: as in free software or free speech, and gratis: like free beer provided at your company picnic.

          These shouldn’t be new concepts for someone who follows tech…

        • Flapdrol
        • 5 years ago

        royalty free?

      • spp85
      • 5 years ago

      Its cheaper than nVidia G-SUCK technology

        • Terra_Nocuus
        • 5 years ago

        Which is why AMD’s marketing team chose [b<]Free[/b<]sync.

        • Klimax
        • 5 years ago

        For supposedly sucky technology, it apparently provides a lot of value… and it’s already shipping.

      • Ninjitsu
      • 5 years ago

      THAT’S OK.

      • Airmantharp
      • 5 years ago

      Adding FreeSync to a monitor that otherwise would not support it is not free. I’m pretty sure that’s what SCR250 meant, and that’s the truth.

        • Sam125
        • 5 years ago

        SCR250 is purposely misconstruing what free means in this case, which is royalty free.

        So I’d say he’s lying. Either that or he’s being purposely stupid but considering he seems to be a Nvidia fanboy I’d say he’s lying moreso than being stupid. Well as un-stupid as a fanboy can be anyway.

          • Airmantharp
          • 5 years ago

          The moment you start labeling someone a ‘fanboy’ you throw your own perspective into question; compare what SCR250 has said to what spp85/86 has said. I believe that what he is saying when taken generally is true as I posted above, and I agree that he could have avoided some negative feedback if he’d been more specific.

          But seriously, I got what he was saying on the first run.

            • Sam125
            • 5 years ago

            Well, one or two people (maybe spp85 and spp86 are two different people) throwing the term around superfluously doesn’t make SCR250 any less of one. He’s just not as immature about it as the other user is.

            Yeah, I got what he was saying too and he was being disingenuous at best.

            • Airmantharp
            • 5 years ago

            Taking his comment as a response to the article, I’ll have to agree. He’s generally correct, but it does appear that he had the intent to troll at some level.

      • Meadows
      • 5 years ago

      PaidSync, amirite?

      • chuckula
      • 5 years ago

      See kids, you can call me an anti-AMD shill all you want, but I can’t even come close to that level of fanboyosity.

        • SCR250
        • 5 years ago

        How about we call you a J E R K

      • firagabird
      • 5 years ago

      In the same line of thought and comprehensibility…

      G-Sync is not GEE.

      The above statement makes just as little sense, and has the bonus of not being an outright lie.

        • Terra_Nocuus
        • 5 years ago

        so it doesn’t make people say “Gee whiz”? awww, man…

    • jessterman21
    • 5 years ago

    Very sad that I will have to wait years for this tech to come to a no-frills 60Hz monitor at below $200.

      • jessterman21
      • 5 years ago

      But I will press on until then, downsampling on my 900p monitor 🙂

    • Tristan
    • 5 years ago

    So, the whole FreeSync (DP 1.2a) thing is nonsense. Early next year, there will be monitors with DP 1.3, with adaptive sync 'built in'.

      • spp85
      • 5 years ago

      So what?? You still want to stay at DP1.2a next year?? If DP1.3 is a superset of 1.2a, what is the problem with it??

    • Chrispy_
    • 5 years ago

    Since G-Sync proves you don’t need 144Hz for fluidity, we should be seeing some “gaming” monitors using IPS panels at last.

    Perhaps, too, Freesync will encourage monitor manufacturers to not peg IPS screens at 60Hz. Most of them clock up to 90Hz without issue, and it all helps.

      • Zizy
      • 5 years ago

      Is IPS pixel transition fast enough for that? Those 120 and 144 Hz monitors don’t just have fast refresh rates; their pixel transitions are quite quick as well. It doesn’t help to drive the TFT at 1kHz if the liquid crystal takes 10ms to change state. Also, I game on IPS already, 3x ZR24w. Mostly strategy games, no need for anything ultra fast.

      Anyway, OLED FTW 😛

        • Chrispy_
        • 5 years ago

        I’d love OLED, but it’s not going to happen 😛

        Most of the S-IPS or H-IPS panels that are doing the rounds in Korean screens are the LG-Philips 6ms GTG variant, and they’ve been tested by various sites with proper gear to have a total pixel response (including input lag) of around 9ms. That’s not a lot worse than the 4-5ms of the best TN gaming panels; 9ms is still enough for ghost-free 100+ Hz refreshes.

        I think the more important thing for reducing blur (and it’s [i<]blur[/i<] that people usually mean when they talk about ghosting or smearing) has to do with constant backlights and something called sample-and-hold blur. Strobing backlights work much better with how our brain perceives moving objects on screen, which is why CRTs feel smoother at 60Hz than some LCDs do at 120Hz.

          • Sam125
          • 5 years ago

          [quote<]I'd love OLED, but it's not going to happen :P[/quote<] Well, one can still dream. 🙁

      • superjawes
      • 5 years ago

      [quote<]G-Sync proves you don't need 144Hz for fluidity[/quote<] That's a little bit true and a little bit unknown. With G-Sync, we're really talking about refresh time instead of rates, so the higher rate translates to better time... in other words, a higher refresh rate means the whole frame gets displayed faster. Now I'm sure that G-Sync at 16.7 ms will be better than 60 Hz constant, but lowering the time to 6.9 ms might add another level of fluidity. Unfortunately, I think all G-Sync monitors have been 144 Hz, which means that "60 Hz G-Sync" is only being simulated. The actual refresh time is still going to be 6.9 ms.
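
      Quick napkin math on those scan-out times, if anyone wants to check me (my own arithmetic, nothing official):

```python
# Minimum time to scan a whole frame out to the panel at its peak refresh rate
for hz in (60, 144):
    print(f"{hz} Hz panel -> {1000 / hz:.1f} ms to put a full frame on screen")
# 60 Hz -> 16.7 ms; 144 Hz -> 6.9 ms, even if it's only being fed 60 frames per second
```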

        • Chrispy_
        • 5 years ago

        That’s not quite how G-Sync works.

        If you go back and read the original articles here and elsewhere, you’ll see everyone saying that even 35-40fps feels silky smooth. The beauty of a G-Sync monitor is that if the graphics card is churning out frames at 37fps in a particular scene, you get 37Hz where each image is immediate and temporally accurate. Using Vsync without G-Sync, the framerate jumps erratically from 30 to 60 fps as some frames make it within 16.6ms while others are queued up for 33.3ms. Not only is the framerate erratic, with a high and perceptible 33.3ms maximum, it’s also showing images that are temporally dated, meaning that depicted movement within the images doesn’t correspond correctly to the temporal pattern it should follow: flickery displays AND jerky motion-tracking.

        Everyone’s threshold of perceived “smooth” or “fluid” framerate is slightly different, but doing pinwheel tests and cine-reel testing I’ve found mine to be about 42-43fps. For me that means that an 85Hz screen used to be a great solution, because even with vsync enabled, a repeated frame at 85Hz still produced an equivalent 42.5fps, which I perceived as fluid.
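
        If it helps, here's the timing difference in crude Python (purely illustrative, not either vendor's actual logic; the fixed 60Hz interval is just an example):

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval for the vsync case

def vsync_display_ms(render_ms):
    # With plain vsync, a finished frame waits for the next fixed refresh tick.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def adaptive_display_ms(render_ms):
    # With G-Sync/FreeSync, the refresh fires when the frame is ready
    # (within whatever range the panel supports).
    return render_ms

for render_ms in (15.0, 20.0, 27.0):
    print(f"rendered in {render_ms:.0f} ms -> vsync shows it at {vsync_display_ms(render_ms):.1f} ms,"
          f" adaptive at {adaptive_display_ms(render_ms):.1f} ms")
# e.g. a 20 ms frame waits until 33.3 ms with vsync, but appears at 20 ms with adaptive refresh
```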

          • superjawes
          • 5 years ago

          No, that’s not what I’m saying. Yes, G-Sync’s biggest improvement is showing a “true” (or at least truer) FPS, but I’m talking about the time it takes to replace information on the screen. When the monitor receives an update signal, it triggers the refresh, which takes a finite amount of time. A 60 Hz G-Sync monitor wouldn’t be able to update the whole display faster than 16.7 ms, while a 144 Hz G-Sync one can update the image as fast as 6.9 ms.

          What I’m saying is that I wouldn’t discount the refresh time’s impact on smoothness without some lower speed (60 Hz level) samples. I fully admit that this may be a non-issue.

            • Chrispy_
            • 5 years ago

            Oh, you mean the time it takes the image change to scan from top to bottom?
            That’s a non-issue, but I understand your concern, coming from the old analogue scanning rate of CRTs.

            Super high-speed footage of LCD displays shows that the scanning speed is so much faster than the pixel response that the whole screen can essentially be considered as updated instantly, rather than a rolling refresh top to bottom.

            The limitation of ASICs these days has more to do with bandwidth than anything else, as I understand it.

            • superjawes
            • 5 years ago

            That’s exactly what I mean, and I’d just like to see it explored, as it could be a determining factor in bringing things to IPS displays.

            • Chrispy_
            • 5 years ago

            Well, with a strobing backlight it’s not going to make any difference, but if you want to see the rolling refresh recorded at 1000fps and read a lot more about it I can highly recommend [url=http://www.tftcentral.co.uk/articles/content/motion_blur.htm<]this link[/url<].

          • beck2448
          • 5 years ago

          Gsync has been demonstrated, is selling, and WORKS. For AMD it’s always what we will do will be awesome. Like their crapfire solution that actually didn’t work for years. Anyone get a refund on that?

            • Chrispy_
            • 5 years ago

            SLI was just as bad for at least half a decade, and had stupid motherboard limitations and licensing costs.

            Nvidia did sort it out before AMD, but only by 12-18 months. Why are you so anti-AMD when they represent the underdog fighting for the consumer with more open, flexible, platform-agnostic choices – and usually at lower cost? Do you like paying premiums to have your choices limited?

            • 0x800300AF
            • 5 years ago

            Feel free to let me know when G-Sync works with a single GPU and Surround/Eyefinity + 3D or Lightboost… or windowed mode. After all your implication that G-Sync is perfect, you should be able to provide said examples with a quick Google search. Expecting a response in the next 15 min… starting NOW…
            edit: 12 min counting
            edit 2: 8 min… tick tock tick tock
            edit 3: .. predictable

      • HisDivineOrder
      • 5 years ago

      I wish. Most of the reason they’ve stuck with TN over IPS is because this is a way for them to sell those crappy TN panels at a premium, often over the cost of the far better IPS or even PVS displays. And people will pay the premium because they’re being called “gaming monitors” and they list that 144hz as a spec.

      Do you really think they’re going to endanger that by muddying the waters with the fact that IPS could easily go up to hz levels that make the 144hz of TN panels irrelevant? They’d have to make them more expensive by far to match the level of markup they’re getting on those el cheapo TN panels they use.

      No. They will keep the product lines stratified to ensure they can mark up TN panels to obscene levels around gaming features like Gsync or Freesync while keeping IPS panels in more expensive monitors that focus on color reproduction for professionals.

      They make more money that way.

        • Chrispy_
        • 5 years ago

        It’s about maximising profits, isn’t it 😉
        Gamers don’t use photoshop, care about colour accuracy, or know what viewing angles are…. TN will do fine.

        I guess in the quest for making specs look better on the press release, TN has lower numbers than IPS, but those published response times are just marketing lies:

        [list<][*<]A Philips AH-IPS panel as per many Dell Ultrasharps claims 8ms G2G response times, and is oscilloscope-tested to actually have an [url=http://www.tftcentral.co.uk/reviews/dell_u2414h.htm<]average G2G response time of 8.9ms[/url<].[/*<][*<]A "1ms" gaming TN like [url=http://www.tftcentral.co.uk/reviews/benq_xl2720z.htm<]this one[/url<] uses aggressive overdrive to get response times down, but using the exact same oscilloscope testing, the best result is a G2G response time of 3.4ms with a [i<]shocking[/i<] 14% colour error! Firstly, that's a [i<]loooooong way[/i<] off 1ms, and secondly, 14% is woefully inaccurate colour resulting in all kinds of visible overdrive artefacts and fringes around objects. They also tested the TN panel with AMA overdrive disabled, and the "1ms" TN's G2G response was 7.5ms. This makes sense because the fastest TN panels in existence before overdrive came along were about 5ms.[/*<][/list<]

        So:
        8-bit, beautiful 8ms AH-IPS panel = 8.9ms response time with zero colour error and no artefacting
        6-bit, woeful 1ms TN panel = 7.5ms response time, or 3.4ms if you want the picture to look even worse than it already is on TN.

        Clearly, the panel isn't the main thing bringing the response times down; it's aggressive overdrive sacrificing image quality for speed. Worse than that, though, AMA overdrive calculations add input lag, which is even worse than the blur they're trying to combat in the first place!

      • Bensam123
      • 5 years ago

      Refresh rate is part of the equation, the other part is response time of the pixels themselves. That’s why they use TN panels for things like that.

        • Chrispy_
        • 5 years ago

        Check out my response to HisDivineOrder above.

        IPS G2G response times without artefacting and overdrive errors = 8.9ms
        TN G2G response times without artefacting and overdrive errors = 7.5ms

        It’s not really the panel that makes TN screens fast, it’s that the pixels are overdriven so strongly. I bet you could get an IPS panel to respond in 4ms if you overdrove it like a TN gaming panel. Nobody’s done that yet, because why would anyone pay extra for IPS’s colour accuracy and richness only to make it [i<]look awful[/i<]? TN is already so bad that people don’t complain about overdrive issues 😉

    • Firestarter
    • 5 years ago

    Bummer, looks like my HD7950 won’t support FreeSync 🙁

      • spp85
      • 5 years ago

      You can at least do video playback more fluidly with FreeSync on that GPU 😉

        • Firestarter
        • 5 years ago

        I don’t need FreeSync to output 24hz video over HDMI

          • spp86
          • 5 years ago

          Fanboy caught RED handed. NOT through HDMI, only through DisplayPort 1.2a. Also, nvidia G-Sync does that same thing you said, 24Hz video.

            • Terra_Nocuus
            • 5 years ago

            Uh… reading comprehension fail? Firestarter was frowning due to his HD7950 [i<]not getting Freesync support [b<]while gaming*[/b<][/i<]. Why would you call him an Nvidia fanboy when he has an AMD card? /smh *edit

            • superjawes
            • 5 years ago

            All right, we have spp85, and now spp86. I have a feeling that the first account got banned and the Hammer is about to fall on this one, too.

            Stop responding to this guy.

            • spp86
            • 5 years ago

            You are a sponsored nVidia person and want to sabotage AMD through aggressive marketing on tech forums.

            • Terra_Nocuus
            • 5 years ago

            heh, I’ll try 🙂

            • spp86
            • 5 years ago

            See… whenever I comment against that “superjawes” I instantly get -3 negative votes??? He does have multiple fake accounts on this site.

            • christos_thski
            • 5 years ago

            Crawl back to your hole, you 12 year old fanboy.

            • snook
            • 5 years ago

            he is a gold subscriber and gets 3 up or down votes per comment, iirc.
            look up, no, go outside and look up. see that?

            exactly

            • spp86
            • 5 years ago

            He said 24Hz through HDMI; how is that possible when AMD specifically states it works through DisplayPort 1.2a? And what did he mean by 24Hz video?? If 24Hz video means anything, then that applies to Nvidia G-Sync as well. Hope that’s clear.

            • Terra_Nocuus
            • 5 years ago

            [quote<]He said 24Hz through HDMI how is that possible[/quote<] Since you asked, I believe Firestarter is referring to HDMI's native support for 24 & 30 fps video playback.

            • Firestarter
            • 5 years ago

            exactly

    • spp85
    • 5 years ago

    AMD gave a befitting response to nVidia’s expensive proprietary G-SUCK technology. Soon panel vendors will stop manufacturing G-SUCK monitors due to the $200 premium, low demand, and a higher chance of monitor failure. Well done AMD, well done…..

    • Krogoth
    • 5 years ago

    More hype on band-aids!

    • Alexko
    • 5 years ago

    “We will dutifully subject ourselves to hours of gaming “tests” to get to the bottom of that important question.”

    What a hard job you have! 🙂

      • Airmantharp
      • 5 years ago

      And this is what I’m waiting for. Jury’s definitely still out.

    • superjawes
    • 5 years ago

    [quote<]...it will be interesting to see how the finished products compare to equivalent G-Sync solutions.[/quote<] This is what I'm waiting for. AMD seems to be implying that G-Sync and FreeSync are the same thing, but Nvidia already has monitors in the wild using G-Sync at 144 Hz. The only demonstration I remember of FreeSync was on a laptop, and that is because the original specification was intended to reduce refresh rates in order to save power--not to increase the quality of animation.

      • Firestarter
      • 5 years ago

      From what I’ve gathered so far, all else being equal, G-Sync will be superior to FreeSync. The question that remains is whether it’s a notable difference or not.

        • spp85
        • 5 years ago

        Oh please….. explain how you came to the conclusion that G-Sync is superior to FreeSync. Don’t play fanboy here man…

          • lilbuddhaman
          • 5 years ago

          It costs more, duh.

          edit: and nvidia shit actually works

          • erwendigo
          • 5 years ago

          Gsync runs flawlessly without any other “help” than an Nvidia card. Freesync doesn’t.

          Freesync needs third-party software that enables triple buffering to reach the same results Nvidia gets with Gsync, because the Gsync module included with Gsync monitors implements a hardware triple-buffer solution.

          Freesync without triple buffering is an intermediate solution between a fixed vsync rate and G-Sync. You need triple buffering to reach the same result, and not all games are compatible with triple buffering forced through third-party software.

            • superjawes
            • 5 years ago

            I wouldn’t say G-Sync only needs an Nvidia card. Current monitors are replacing ASICs with Nvidia-produced FPGAs. Ultimately, those FPGAs will be replaced with ASICs, but in the short term, you do need an extra piece of hardware to make G-Sync work.

            • spp85
            • 5 years ago

            How do you know this?? Did AMD explain this to you? Nvidia FANBOY

            • xeridea
            • 5 years ago

            Gsync requires a custom chip in the monitor that adds like $300 to the price. Being vendor-locked makes it less likely to be used, and you can forget laptops ever using it.

            • superjawes
            • 5 years ago

            Okay, I’ll say it again…

            Current G-Sync monitors are not using the final tech. The premium you are seeing is coming from prototype parts using FPGAs instead of matured ASICs. The latter is much, [i<]much[/i<] cheaper.

            Also, G-Sync doesn't have to be (GPU) vendor locked. It is right now, but again, the monitors are in prototype phase. Once the ASICs are finished and in full production, any GPU should be able to activate G-Sync functionality, and Nvidia will have the option of just selling the ASICs (or the ASIC design) to manufacturers.

            • spp86
            • 5 years ago

            1. How do you know that nVidia G-Sync is not final?? Did nVidia tell you that personally??

            2. AMD’s Freesync doesn’t need ASICs at all. That is almost FREE!!!

            My god you are a terrible sponsored nVidia spokesperson.

            • superjawes
            • 5 years ago

            For the sake of other people reading this…

            1. FPGA = Field Programmable Gate Array. It’s basically a digital chip that can be reprogrammed into new designs. They are very flexible devices, but that also means they are expensive. Good for development; bad for full production.

            ASIC = Application Specific Integrated Circuit. These are custom or "custom" digital chips that perform specific tasks, and you're etching the design straight into silicon. This means you can't really use them outside of the application, and they're expensive to develop, but once they [i<]are[/i<] developed, they are extremely cheap and reliable. Great for full production; impossibly expensive for prototypes, as each design iteration would require completely new tooling.

            2. All monitors have ASICs. Since FreeSync requires a BOM change, I am going to assume that means a change to the existing ASICs. Whether it's G-Sync or FreeSync, both will have to touch the silicon.

            • spp86
            • 5 years ago

            So you are saying that Nvidia makes that ASIC in the future, as you said. Production of that specific ASIC design will be handled by the monitor vendor, and that's something you don't need to worry about.

            • Terra_Nocuus
            • 5 years ago

            Laptops wouldn’t need it:

            [quote<]Laptops, [Nvidia's Tom Peterson] explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.[/quote<]

          • Firestarter
          • 5 years ago

          I don’t exactly know, but it had to do with the timings. I think it was something like this: with FreeSync, the game engine or graphics driver has to try to predict when the next frame will arrive and tell the monitor when it should expect it, whereas with G-Sync, the monitor just gobbles whatever frames the GPU spits out and immediately displays them if possible. G-Sync needs a separate framebuffer in the monitor to allow this and sort of disregards VSync entirely; FreeSync lets the card define a variable VSync interval and relies on the card to honor it.

          But that’s my spotty recollection from groggy memory, so feel free to shoot holes in it all day long. BTW, if I’m any sort of fanboy, I’d be an ATI fanboy.

            • DaveBaumann
            • 5 years ago

            I’ve stated this before: this is incorrect. The Freesync mechanism is in control of the VBLANK timing and as such updates it when the frame flip occurs. There’s no "predictive" mechanism or anything.

            • Firestarter
            • 5 years ago

            I don’t know where I got that from, but it doesn’t really matter. The proof is in the synced pudding, and that is what I hope to see when these displays hit the review channel. It might be time to prematurely retire my GPU and display!

            • Airmantharp
            • 5 years ago

            I’ve seen that around the web too, but all I really know is that there isn’t enough information available about either technology to really make an educated guess.

      • spp85
      • 5 years ago

      You had better wait and see rather than leveling blatant criticism against AMD’s innovation.

        • Terra_Nocuus
        • 5 years ago

        Stating facts is criticism, now? Freesync is based on technology that allowed laptops to lower their refresh rate, thus saving battery life. Links for you:
        [url<]https://techreport.com/news/25867/amd-could-counter-nvidia-g-sync-with-simpler-free-sync-tech[/url<] [url<]https://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo[/url<]

        • superjawes
        • 5 years ago

        I am giving Nvidia a lot more leeway than AMD right now, but that's because 1) Nvidia presented the idea first, getting everyone excited about it, and 2) Nvidia led strong, showing working prototypes at the announcement. Again, AMD's only demonstration that I can remember was for laptops, where the base tech was used to reduce power consumption, [i<]not to increase the fluidity of animation[/i<].

        I'm willing to take that leeway away if AMD can do the same thing and cut out Nvidia's ability to license the tech, but they actually have to prove that it works first, and they have to prove that it works [i<]at least as well as G-Sync[/i<]. Until then, the burden is on AMD to deliver.

        Lastly, AMD seems to be implying that Nvidia is going to license this tech everywhere. I've said from the beginning that current G-Sync implementations are all prototypes, using expensive FPGAs. It's not a licensing issue right now, it's a cost of materials one. Theoretically, Nvidia could give out the final ASIC design for free, but they have to finish the design first.

      • puppetworx
      • 5 years ago

      AMD has also shown a currently available desktop monitor using FreeSync after just a firmware upgrade.
      [url<]https://www.youtube.com/watch?v=fZthhqmhbw8[/url<]

        • superjawes
        • 5 years ago

        Thanks, I missed that.

        I’d still like to see tests comparing the two. That video shows a max refresh rate of 60 Hz (16.7 ms refresh time). I’m hoping that the monitor tech (FreeSync) will actually scale to higher speeds, as that’s one of the things that G-Sync has proven to do. Furthermore, I’d like to see if there is a noticeable difference between refresh speeds, in which case the scaling would only be important if you wanted to output something higher than 60 FPS.

      • Waco
      • 5 years ago

      It doesn’t really matter what the original intent was when the final result is identical functionality, does it?

      G-Sync made it to market first and refreshes the screen when a new frame is ready.
      FreeSync literally does exactly the same thing…it’s just not out yet. There’s no reason it won’t work identically.

        • superjawes
        • 5 years ago

        G-Sync delivers the frame as soon as it’s ready, FreeSync is still a question, especially considering that many descriptions have mentioned buffering (which means a delay between frame complete and frame display).

        That could just be the original tech, and AMD is just getting rid of the delay, but if there’s a difference between the technologies, then we can’t assume that the results will be the same. On top of that, there could be a scaling issue getting to 120-144 Hz that Nvidia is already in the process of fixing with the G-Sync prototypes.

        If they are the same, great. It means that we’ve got more people working on implementing the full production versions. However, if they are different, I would like to know how they are different, and what that means as someone looking for a new monitor.

          • DaveBaumann
          • 5 years ago

          Ironically it’s G-Sync that’s doing some level of buffering, hence the large pool of memory. Freesync requires no buffering – the GPU is aware of the timing ranges from the panel and just controls the VBLANK signal, timing it with the application frame flip.
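
          Conceptually it's no more complicated than this (illustrative pseudocode only, with a made-up 40-144Hz window; nothing lifted from an actual driver):

```python
# Panel advertises its supported refresh window over DisplayPort, e.g. 40-144 Hz (made up here).
MIN_FRAME_MS = 1000 / 144   # ~6.9 ms: can't flip faster than this
MAX_FRAME_MS = 1000 / 40    # 25 ms: can't hold a frame longer than this

def refresh_time(ms_since_last_flip):
    """End the current VBLANK when the application flips, clamped to the panel's window."""
    return min(max(ms_since_last_flip, MIN_FRAME_MS), MAX_FRAME_MS)

for t in (5, 12, 20, 40):
    print(f"frame flipped after {t} ms -> refresh fires at {refresh_time(t):.1f} ms")
# 5 ms gets held to ~6.9 ms; 40 ms overruns the window, so the panel is refreshed at 25 ms regardless
```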

            • MathMan
            • 5 years ago

            You do know that the presence of a buffer doesn’t imply additional latency, right?

            As was explained by Tom Petersen in their follow-up interview after the mess that Huddy made: the memory is used as a look-aside buffer. This can be done without any latency.

            So unless you have actual proof that it introduces real delay, you’re just spreading FUD.

            • Airmantharp
            • 5 years ago

            The real answer is this:

            We don’t know. For G-Sync or for FreeSync. We don’t know which one costs more at the hardware level, and we don’t know which one provides the ‘best’ experience; and we don’t know which one will really become ‘the standard’.

            • DaveBaumann
            • 5 years ago

            The original reviews point out that there is ~1ms delay for polling.

            • MathMan
            • 5 years ago

            Delay due to polling is orthogonal to delay due to a memory.

            But I’m happy you mention that it’s only 1ms: could you educate your chief gaming scientist about that?

            He said in the interview that the buffer introduces a full frame of latency. (IOW: that’s at least 6ms.)

            • superjawes
            • 5 years ago

            I should add that I did find a few reviews that mention the ~1ms delay due to polling, but each one also said that Nvidia was working to eliminate that.

            Also, just because FreeSync doesn’t do that doesn’t mean that there isn’t something else that could introduce a delay, which is why we need to see these two technologies go head-to-head.

            • Flapdrol
            • 5 years ago

            …….

            • Klimax
            • 5 years ago

            That large pool of memory is dominantly an FPGA thing, not a G-SYNC thing. As if you didn’t know how FPGAs look and work. (HINT: This is an altered generic board.)

            • DaveBaumann
            • 5 years ago

            [url<]http://anandtech.com/show/7582/nvidia-gsync-review[/url<] "The added DRAM is partially necessary to allow for more bandwidth to memory (additional physical DRAM devices). NVIDIA uses the memory for a number of things, one of which is to store the previous frame so that it can be compared to the incoming frame for overdrive calculations"

            • MathMan
            • 5 years ago

            Exactly: and this can be done without any additional latency if implemented as a look-aside buffer.

            • MathMan
            • 5 years ago

            If this is an altered generic board, it should be a breeze to find an equivalent one on the web.

            (There is no way this is generic…)

            • Airmantharp
            • 5 years ago

            I doubt that the board is generic, as it is part of a customized module, but the FPGA is, and I’m pretty sure that is what Klimax meant, i.e. the currently available G-Sync module is cobbled together using off-the-shelf parts.

          • Waco
          • 5 years ago

          Different methods of doing the exact same thing should produce identical results. Perhaps that’s just me being hopeful, but the end result of either method will be the same: frames being displayed when they’re ready, in full, without tearing.

      • spp86
      • 5 years ago

      [quote<].........The only demonstration I remember of FreeSync was on a laptop, [/quote<] That was only AMD's intelligent beginning. The rest follows. You are trying to undermine AMD's attempt at countering nVidia's G-SYNC. Sounds like a fanboy.

    • meerkt
    • 5 years ago

    For movies you can use multiples of the framerate, so there's no real need to go as low as 24Hz. It would actually be better to use higher refresh rates for the mouse and UI.

    But in games, dips to 15-20Hz are still playable, depending on how often they happen and the game type. Unlike movies, the framerate is not predictable, so you can't use multiples. So it would make more sense to support even lower refresh rates.

      • GrimDanfango
      • 5 years ago

      I fail to grasp why there seems to be the need for a minimum at all. Seeing as the basic essence of GSync/FreeSync is “hold current frame until you receive a refresh request”, I can’t see why there’d be any technical limitation stopping it from holding a frame for 1/2 a second, when 1/24, 1/37, 1/53 are all perfectly achievable.
      Obviously there’s a reason for a maximum… but does anyone know why the current GSync monitors specify/require a minimum?

        • Zizy
        • 5 years ago

        I think it is mostly because these transistors leak and need to be refreshed after a while. But panel self refresh (without any GPU command) could be used to achieve arbitrarily long times between 2 different pictures from the GPU, assuming the protocol allows that.

        • willmore
        • 5 years ago

        I would imagine it’s because the monitor would need to keep a copy of the frame internally to self-refresh, and that costs storage. Also, what if the monitor decides that it needs to refresh and starts to do so, just as the PC decides that it’s got the next frame ready and starts sending it?

        There needs to be some level of communication. And that’s where you get this idea of ‘supported rates’. It’s basically the monitor saying “I must be redrawn every X ms, you take care to do that.” It pushes the complex stuff over to the PC where hardware is cheap and flexible.
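
        Something like this, conceptually (just my sketch of the idea with a made-up 25ms figure, not any real implementation):

```python
MAX_HOLD_MS = 25.0   # hypothetical "I must be redrawn every X ms" figure from the monitor

def what_to_send(ms_until_next_frame, previous_frame):
    """If no new frame will be ready before the deadline, the PC resends the old one."""
    if ms_until_next_frame > MAX_HOLD_MS:
        return previous_frame, MAX_HOLD_MS      # repeat the old frame at the deadline
    return "fresh frame", ms_until_next_frame   # otherwise flip the new one on arrival

print(what_to_send(12.0, "old frame"))   # ('fresh frame', 12.0)
print(what_to_send(40.0, "old frame"))   # ('old frame', 25.0)
```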

          • Zizy
          • 5 years ago

          Power consumption should be lower with panel self refresh than with the GPU resending the same data constantly – not an issue for a PC, but it matters for a phone or laptop. Cost difference – an FHD frame is 2MP*4B/P = 8MB, and that is relatively cheap with eDRAM and still manageable with eSRAM.
          If the PC starts sending the next frame, the panel could simply stop its self refresh and immediately start drawing the new frame, assuming its self-refresh implementation allows that. Plus, you have the same problem if the GPU starts resending the same frame data and then gets a new one ready.
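
          Quick check of that arithmetic, for anyone curious:

```python
width, height, bytes_per_pixel = 1920, 1080, 4
frame_bytes = width * height * bytes_per_pixel
print(f"{frame_bytes / 2**20:.1f} MiB per 1080p frame")  # ~7.9 MiB, so "8MB" is about right
```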

            • willmore
            • 5 years ago

            The panels in laptops and tablets are very dumb displays–they have *no* storage aside from a shift register for the current line being drawn. Going to a smart controller capable of doing self refresh in these applications would be a major shift in how things are done.

            For an external monitor, things are different.

          • GrimDanfango
          • 5 years ago

          Yeah, good points. I did just think of the “what if it’s already self-refreshing when a request comes through” thing.
          So the limitation is on how long a panel can hold a charge before leaking then. I think 15hz would be an ideal minimum for panels to be able to cope with… that would cover occasional drops below 20-25 for heavy games, while presuming that if it drops any lower, you’re into the realms of unplayable-slideshow anyway.

    • JustAnEngineer
    • 5 years ago

    Let us know if NVidia adopted DisplayPort 1.2a with GeForce GTX880.

      • BitBlaster
      • 5 years ago

      ..and the GTX 760

      • Airmantharp
      • 5 years ago

      One would think that a smart Nvidia would support both solutions; regardless of which solution is superior (however that gets defined), I highly doubt that Nvidia could make more money from G-Sync licensing than they could through potentially greater sales.

        • MathMan
        • 5 years ago

        They’re not licensing anything. They’re selling hardware that solves a problem.

        A lot of people seem to have a problem with that concept.

      • MathMan
      • 5 years ago

      I’m sure they will be very compatible with the mandatory parts. Whether they support all the optional parts is a different story.
