Nvidia responds to AMD’s "free sync" demo

CES — On the show floor here at CES today, I spoke briefly with Nvidia’s Tom Petersen, the executive instrumental in the development of G-Sync technology, about the AMD "free sync" demo we reported on yesterday. Alongside the demo, a senior AMD engineering executive asserted that a variable refresh rate capability like G-Sync ought to be possible essentially for free, without adding any extra costs to a display or a PC system. Petersen had several things to say in response to AMD’s demo and claims.

He first said, of course, that he was excited to see his competitor taking an interest in dynamic refresh rates and thinking that the technology could offer benefits for gamers. In his view, AMD’s interest was validation of Nvidia’s work in this area.

However, Petersen quickly pointed out an important detail about AMD’s "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.
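
To make the distinction concrete, here is a minimal conceptual sketch (purely illustrative; the stage names are assumptions, not terms from any spec or driver) of the two signal chains Petersen described, with the stage that fixes the refresh timing called out:

```python
# Illustrative sketch only: the two display chains described above.
# Stage names are assumptions for illustration, not spec or driver terms.

LAPTOP_CHAIN = [
    "GPU framebuffer",
    "eDP / LVDS link",          # direct GPU-to-panel interface
    "LCD panel",
]

DESKTOP_CHAIN = [
    "GPU framebuffer",
    "DisplayPort / HDMI link",
    "scaler ASIC",              # re-times the incoming signal to a fixed panel refresh
    "LCD panel",
]

def supports_variable_refresh(chain):
    # A conventional scaler re-drives the panel at a fixed rate, so any
    # variable frame timing coming from the GPU is lost at that stage.
    return "scaler ASIC" not in chain

print(supports_variable_refresh(LAPTOP_CHAIN))   # True
print(supports_variable_refresh(DESKTOP_CHAIN))  # False
```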

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia’s own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia’s intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn’t think it’s necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That’s why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.
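
As a rough illustration of what "extending the vblank interval" means in practice, here is a minimal sketch (not vendor code; the panel's refresh limits are assumed for the example) of how holding the panel in vblank lets the refresh interval track the GPU's frame time:

```python
# Minimal sketch, assuming a hypothetical 30-144 Hz panel: the display is held
# in vblank until the next frame is ready, so the refresh interval follows the
# frame render time within the panel's limits.

PANEL_MAX_HZ = 144.0   # assumed fastest refresh the panel supports
PANEL_MIN_HZ = 30.0    # assumed slowest refresh before the panel must redraw anyway

def refresh_interval_ms(frame_render_ms):
    shortest = 1000.0 / PANEL_MAX_HZ   # can't refresh faster than the panel allows
    longest = 1000.0 / PANEL_MIN_HZ    # must refresh before the image must be redrawn
    return min(max(frame_render_ms, shortest), longest)

for ms in (8.0, 16.7, 22.0, 40.0):
    interval = refresh_interval_ms(ms)
    print(f"frame took {ms:5.1f} ms -> refresh after {interval:5.1f} ms "
          f"({1000.0 / interval:5.1f} Hz)")
```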

That said, Nvidia won’t enable G-Sync for competing graphics chips because it has invested real time and effort in building a good solution and doesn’t intend to "do the work for everyone." If the competition wants to have a similar feature in its products, Petersen said, "They have to do the work. They have to hire the guys to figure it out."

This sentiment is a familiar one coming from Nvidia. The company tends to view its GeForce GPUs and related solutions as a platform, much like the Xbox One or PS4. Although Nvidia participates in the larger PC gaming ecosystem, it has long been guarded about letting its competitors reap the benefits of its work in various areas, from GPU computing to PhysX to software enablement of advanced rendering techniques in AAA games.

Like it or not, there is a certain competitive wisdom in not handing off the fruits of your work to your competition free of charge. That’s not, however, how big PC players like Intel and AMD have traditionally handled new standards like USB and x86-64. (Intel in particular has done a lot of work "for everyone.")

If you recall our report from yesterday on this subject, Nvidia and AMD do seem to agree on some of the key issues here. Both firms have told us that the technology to support variable refresh rates exists in some cases already. Both have said that the biggest challenge to widespread adoption of the tech on the desktop is support among panel (and scaler ASIC) makers. They tend to disagree on the best means of pushing variable refresh tech into wider adoption. Obviously, after looking at the landscape, Nvidia chose to build the G-Sync module and enable the feature itself.

My sense is that AMD will likely work with the existing scaler ASIC makers and monitor makers, attempting to persuade them to support dynamic refresh rates in their hardware. Now that Nvidia has made a splash with G-Sync, AMD could find this path easier simply because monitor makers may be more willing to add a feature with obvious consumer appeal. We’ll have to see how long it takes for "free sync" solutions to come to market. We’ve seen a number of G-Sync-compatible monitors announced here at CES, and most of them are expected to hit store shelves in the second quarter of 2014.

Comments closed
    • Kaleid
    • 6 years ago

    The industry is run by morons. Everyone would benefit if they developed an industry standard for all to use.

    • deb0
    • 6 years ago

    So sick of AMD’s constant whining and lack of engineering leadership. Since they believe dynamic refresh should be free, then they should spend their own treasure in making it happen. The most promising technology that’s come from the AMD camp is Mantle, and that has yet to come to fruition. Nvidia has every right to protect their intellectual property. For them to share G-sync would be foolish and allow other firms to reap the benefits of their research and development without spending a dime.

    I think dynamic refresh is a long way away from being worth the cost. Technology in monitors is taking some serious leaps every year.

      • sschaem
      • 6 years ago

      Please read cygnus1's post.

      • Diplomacy42
      • 6 years ago

      [quote<] Since they believe dynamic refresh should be free, then they should spend their own treasure in making it happen[/quote<] They did, and it is. That was the point of their demonstration using off-the-shelf laptops that were three years old.

    • ThorAxe
    • 6 years ago

    DisplayPort has been available on GPUs since 2009. How many monitors have one today?

    Free sync is pie in the sky. Don’t hold your breath.

      • Reuel
      • 6 years ago

      If enough of us demand a feature before buying, we create a market for that feature.

      • pandemonium
      • 6 years ago

      I’m confused… Are you trying to say that DP is rare? I wouldn’t consider 30% of Newegg’s inventory – starting at $136 and up – having it to be rare…

      [url<]http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007617%20600050983%20600050982&IsNodeId=1&name=1[/url<]

        • indeego
        • 6 years ago

        Actually, I do consider that fairly rare a good 5 years after introduction. This is display, not some optional technology that people don’t necessarily need. 30% of the market is low enough which is why we *still* see VGA and DVI on graphics cards/laptops/projectors/etc.

          • pandemonium
          • 6 years ago

          Point is, there’s a monitor out there for every price point with DP regarding size, hz, quality, manufacturer…

          That’s not rare.

    • Bensam123
    • 6 years ago

    So at the end of the day, this is starting to look like a power play by Nvidia to lock market share into their branded monitors, a capability that would otherwise naturally come along with DP 1.3 in six months? Good move, Nvidia… Not sure how they didn’t expect this to be exposed and AMD to come up with a solution for it, though.

    That’s all the more reason to allow their competition to use it. I think their tech hoarding this time around will definitely bite them in the ass.

      • Reuel
      • 6 years ago

      Hey, 6 months of sales is better than zero sales. I hope it paid for the money they blew designing the FPGA.

      Actually, on second thought, I hope they take a bath on this. Slimy.

    • UnfriendlyFire
    • 6 years ago

    Anyone remember when IBM told 3rd-party manufacturers to piss off after launching the MCA mobo interconnect (Micro Channel Architecture)?

    That’s why VESA Local Bus was developed. Even though it had issues, it was widely adopted because there were no license fees or patent walls to deal with.

      • someuid
      • 6 years ago

      Yep, I do. I also remember MCGA graphics, which was another IBM power play that failed. And let’s not forget some attempts to corner and control a market that have succeeded, like the Microsoft Office file format. Even with a move to open XML, Microsoft poisons it with a proprietary binary field to ensure competing products are kept locked out.

    • Deijya
    • 6 years ago

    “we can either sit around and debate about improving refresh rates and reducing latency, or we can create a solution and profit from it.” – nvidiaism

    • kilkennycat
    • 6 years ago

    It may have missed general notice, but G-sync not only addresses many of the problems with traditional Vsync, but is also a minimum-latency technology.

    nVidia recommends a mouse with a very fast update rate (like the Logitech G602, with 500 position scans per second) for true twitch gamers to take full advantage of the low-latency benefits.

    • WaltC
    • 6 years ago

    [quote<]However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.[/quote<]

    Well, duh...;) Wasn't it obvious from AMD's statements that the laptops were being used for a specific reason relative to the AMD presentation? And didn't AMD state clearly that desktop monitors would have to be configured to respond to dynamic refresh rates that had already been supported in the Catalysts for years? I believe so, so Petersen's point seems elusive.

    nVidia has yet to sell me on G-Sync--simply because the "example" demonstrations purporting to show the differences all greatly exaggerate the jerky "stutter" effect supposedly seen on all non-G-Sync displays. As stated already more than once, my 60Hz, vsync-off displays run *much better* than any of the non-G-Sync examples I've seen to date, both here and from AnandTech. If you have to exaggerate the deficiencies in a non-G-Sync display in order to sell the benefits of G-Sync, you've already failed, imo.

    Still have an open mind, though, and await some convincing evidence--other than what looks like purposefully degraded non-G-Sync displays done just to make G-Sync look better by comparison than it evidently is. That is definitely the wrong way to sell G-Sync, and I daresay few people will be fooled by such an approach.

    • rechicero
    • 6 years ago

    So, if I understood correctly, the main problem is the scaler chip. So for FreeSync you’d actually need a cheaper monitor and let the GPU do the scaling (or the DDM willmore mentioned in #41). The dumber the monitor, the better.

    • iatacs19
    • 6 years ago

    The most tangible benefit of G-Sync is that it’s available now. You can buy it and use it. Other solutions mentioned such as FreeSync or DP 1.3 are coming, but they are not real products you can buy and use.

    Until there is a real alternative you can buy, G-Sync is the only “real” product in the market.

      • l33t-g4m3r
      • 6 years ago

      Lucid’s Virtu, and why can’t AMD/NV copy how they’re doing that?

        • l33t-g4m3r
        • 6 years ago

        I like how the trolls are completely dismissive of lucid’s technology. Is it perfect? No. However, that’s not the point. Lucid’s proven virtual vsync is possible, so the real question is: why can’t amd/nv copy how they’re doing that?

        You don’t need to buy into gsync or freesync if you can run Virtual Vsync on existing hardware.

      • Reuel
      • 6 years ago

      FreeSync is no more unavailable now than G-Sync is. Both require you to buy a certain type of monitor. For G-Sync, that is monitors with the module installed in it; for FreeSync it is laptops that use eDP.

    • Elsoze
    • 6 years ago

    Does anyone know if this sync ability will help/change the micro stutter effect in SLI/Crossfire card setups?

      • Meadows
      • 6 years ago

      Yes, and the answer is no.

        • Reuel
        • 6 years ago

        Meadows is right. This fixes stutter due to Vsync. SLI stutter comes from problems in creating the frames themselves. Different sources.

    • hubick
    • 6 years ago

    My money will be going to whichever company works to create standards compatible across vendors and then competes based on the quality of its implementation instead of vendor lock-in.

    • Meadows
    • 6 years ago

    If you see this so-called “Free Sync”, report him. Civic deeds do not go unrewarded. And contrariwise, compatibility with his cause will not go unVidia.

    Be wise. Be safe. Be aware.

    • Mat3
    • 6 years ago

    Their response did not have the condescending tone to it that I had expected when I read the headline.

      • clone
      • 6 years ago

      Agreed, and while the tone was level-headed, I believe Nvidia should have gone for a licensing deal instead of foolishly trying to make a soon-to-be-mainstream feature exclusive.

    • Prestige Worldwide
    • 6 years ago

    New title:
    nVidia pulls a Krogoth, is not impressed with FreeSync

    • Bensam123
    • 6 years ago

    And Nvidia’s implementation will be left in the dust after AMD implements an open standard. AMD has nothing to lose if it’s in second place. Even if everyone knows about G-Sync now, it will be remembered like FireWire once it’s replaced and forgotten about. Recognition and visibility mean a lot, and Nvidia is starting to lose on that front. Once it’s replaced, it’ll be like it was never first in the first place.

    They can money-grub all they want, but allowing a competitor to use your standard isn’t the same as giving away your trade secrets. This is like a parlor trick; AMD already has a partially working version on extremely short notice. They wouldn’t be breaking the bank by giving this one out for free.

    • ronch
    • 6 years ago

    Nvidia – The Way It’s Meant To Be Synced.

    Edit – /sarcasm

      • LoneWolf15
      • 6 years ago

      This was funny. So not worthy of downvote, IMO.

    • anotherengineer
    • 6 years ago

    ” Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel.”

    So how come regular desktop monitors can’t be made on LVDS or eDP standards??
    Or can they?

      • willmore
      • 6 years ago

      Pretty much because the desktop standards do a lot more device abstraction than the laptop interconnects do.

      For desktops, power isn’t as much of a concern, so the extra layer(s) of abstraction–and the power they use–aren’t much of an issue given the additional flexibility they provide. For a notebook, an extra 1-2W of constantly used power would be a big deal.

      That said, DP does support very dumb monitors using a feature called DDM (Direct Drive Monitor), which pretty much makes them eDP over a DP cable.

      • Spazturtle
      • 6 years ago

      DisplayPort 1.3 does; it is being finalised this year.

      • bhtooefr
      • 6 years ago

      The LVDS (actually, FPD-Link being the more correct name for it) standard was actually tried as an external interface. OpenLDI was a variant intended for external stuff. Very little stuff actually used it, the most notable being the SGI 1600SW.

      eDP is actually not very different from regular DP, but does include some things that haven’t made it into the main DisplayPort spec yet.

    • jihadjoe
    • 6 years ago

    [quote<]That said, Nvidia won't enable G-Sync for competing graphics chips because it has invested real time and effort in building a good solution and doesn't intend to "do the work for everyone." If the competition wants to have a similar feature in its products, Petersen said, "They have to do the work. They have to hire the guys to figure it out."[/quote<] This sort of attitude is exactly why Nvidia doesn't get the same sort of respect as an industry leader that Intel does. When you're in a leadership position sometimes it's actually to your benefit to do work "for everybody". Good will aside, increased sales of bigger, higher resolution monitors mean more GPUs sold, and as the industry leader in terms of market share they stand to gain the most.

      • chuckula
      • 6 years ago

      [quote<]That said, [s<]Nvidia[/s<] [u<]AMD[/u<] won't enable [s<]G-Sync[/s<] [u<]Mantle[/u<] for competing graphics chips because it has invested real time and effort in building a good solution and doesn't intend to "do the work for everyone." If the competition wants to have a similar feature in its products, Petersen said, "They have to do the work. They have to hire the guys to figure it out."[/quote<] Funny how what goes around comes around.

        • NeoForever
        • 6 years ago

        :`( But.. but.. AMD! But.. But.. Mantle! See? See?!
        jk

        • wierdo
        • 6 years ago

        My understanding is nVidia could adopt Mantle if they wanted to, but the obstacles are technical rather than legal. At least that’s the initial claim from some reps.

        • ThorAxe
        • 6 years ago

        Free Sync? More like Wishful Syncing.

        • 200380051
        • 6 years ago

        Ohh…you obviously didn’t read much about Mantle, did you?

        Nvidia are locking everyone out of G-Sync. AMD will be inviting everyone to implement a Mantle driver by making their API open…

        Your argument is invalid.

          • Timezone
          • 6 years ago

          his arguments are almost always invalid.

      • erwendigo
      • 6 years ago

      [quote<]This sort of attitude is exactly why Nvidia doesn't get the same sort of respect as an industry leader that Intel does.[/quote<] Yes, Intel, the same company that doesn’t open up or license the x86 instruction set to anyone else. Nvidia, AMD, and Intel all have many proprietary and heavily closed technologies.

        • jihadjoe
        • 6 years ago

        The big difference is in identifying which technology is key to your business and should be kept closely guarded, and which technologies are auxiliary enablers to it and can be made open to benefit the ecosystem you play in.

        Nobody is expecting Nvidia to give away the Kepler architecture, or for AMD to open-source GCN, but technology that ought to be part of the display itself and the connecting cable maybe could benefit all players involved by being an open standard.

        PCI and PCIe would have quickly gone the way of MCA if their use was limited only to Intel and certain select partners.

        • Reuel
        • 6 years ago

        Doesn’t AMD have an x86 license from Intel?

          • Airmantharp
          • 6 years ago

          And Intel has an x86-64 license from AMD.

            • Fighterpilot
            • 6 years ago

            So tell us again how Free-sync won’t work….you know…the bit about how FPShooters will be the only benefit LOL
            Just let it go dude…..the shovel is buried up to the handle so far.
            /facepalm

            • Airmantharp
            • 6 years ago

            Your impression of clone is… disturbing.

            • Reuel
            • 6 years ago

            Maybe clone was temp banned and this is his other account?

            • Klimax
            • 6 years ago

            Forced by Microsoft, otherwise Intel would use their own…

            • Airmantharp
            • 6 years ago

            That was called IA64, and while it is still very interesting (to me at least), it’s definitely not desktop material. It’s certainly not x86-anything.

            • Klimax
            • 6 years ago

            I didn’t mean IA64…
            Intel had their own backup 64-bit solution based on x86. They even presented it to Microsoft, but MS refused to support another x64 variant. That’s also why Intel could adopt AMD64 that fast. (IIRC there are, however, a few instructions which differ in opcodes and semantics.)

            • Airmantharp
            • 6 years ago

            Well, there was that pseudo-64-bit stuff that they had in the Prescott(?) Pentium 4 series and beyond. Intel seemed to respond pretty quickly when Microsoft went with x86-64 and AMD had said silicon on the market.

            • Klimax
            • 6 years ago

            Correct, they used their own work as basis for fast turnaround. Intel loves back-up options… and they can afford them. (Best example would be Pentium 4 as back up for Itanium and Core for Pentium 4)

            If Microsoft hadn’t forced the use of AMD64, Intel would have finished its own extension and caused further problems for AMD.

      • superjawes
      • 6 years ago

      It seems unnecessary for them to take this stance, too. As I’ve said before, as long as you’re using existing cables, anyone should be able to use the feature if they know how. So Nvidia could finish the ASIC and sell it to Samsung, LG, Dell, etc., along with stickers, and even though AMD can use the feature, everyone would remember that it’s an Nvidia-developed feature thanks to the term “G-Sync,” and they’d get the revenue from ASIC sales to boot.

      But instead, it sounds like they’ll protect their trademark and stifle adoption…genius.

        • tcubed
        • 6 years ago

        Well you can always count on NVIDIA to shoot itself in the foot like this.

    • superjawes
    • 6 years ago

    So basically, the AMD demonstration was a bit misleading, and Nvidia’s work with G-Sync is implementing a feature not currently available on desktop monitors.

    I am a bit conflicted on the back end, though. On one hand, I understand Nvidia’s desire to get paid for the work they’ve done. ASICs aren’t cheap to develop, and they’re clearly footing the bill for development and even distribution of G-Sync into monitors.

    On the other hand, I think they’re going to limit availability of G-Sync by holding it close to the chest. They ought to just finish the design and make their money by selling the ASICs to monitor makers. Sell it to everyone and they’ll make back the money they invested.

      • shank15217
      • 6 years ago

      Misleading? It’s a tech demo… come on.

        • superjawes
        • 6 years ago

        The implication of AMD’s tech demo was that the feature was already available and implemented in systems. That is not the case (and many people pointed that out in the earlier thread), so what purpose does it serve?

          • NeoForever
          • 6 years ago

          The implication was that the feature was implemented in [b<]GPUs[/b<], not [i<]systems[/i<]. It serves the purpose of telling people that 1) The monitor manufacturers weren't interested or it would have happened by now. 2) That hurdle will be removed now that Nvidia's got manufacturers' interest. And 3) "Get media interested in learning about this feature of eDP 1.0 and DP 1.3" - Koduri

          • Reuel
          • 6 years ago

          But it is available and implemented in systems that use eDP, like AMD used in the demo. Soon it will be in any monitor that uses DP v1.3 as well.

            • Klimax
            • 6 years ago

            What is the time lag for implementing a new standard, including an effective implementation of this feature?

            • Reuel
            • 6 years ago

            It’s already implemented in systems with eDP, right now. You probably mean when will it be ready for everything else?

            I hear DP v1.3 won’t be done for another 2 months, and it takes a few months after that for it to appear in anything. So maybe in May? June?

            • Klimax
            • 6 years ago

            And I’d say that’s rather optimistic. (BTW: Yes, I was talking about DP 1.3 for regular displays.)

            Also, it will be interesting to see how good the implementations will be.

      • willmore
      • 6 years ago

      They want to get paid for implementing a feature that was already in the pipeline of the real display-driving ASIC vendors, slightly ahead of those vendors and (supposition) in a non-standards-compliant way. How’s that good?

        • Reuel
        • 6 years ago

        Exactly. It isn’t good.

        • Klimax
        • 6 years ago

        Because currently the ASICs are big bloody unknowns, and it might so happen that they won’t implement the thing well.

        It will be interesting to observe how full DP 1.3 implementations will fare against G-Sync scaler.

      • Aerugo
      • 6 years ago
      • sschaem
      • 6 years ago

      Gsync is a stopgap solution on the desktop but is not needed on mobile, as the spec and GPU already support the feature.

      AMD is just reminding us all that its GPUs have the same gsync HW feature built in, and the tech demo proves it. AMD is just not as bold as nvidia and is waiting for the spec to officially extend to the desktop instead of creating a stopgap solution.

      I have no idea how anyone can slam nvidia or AMD on this subject.

        • Reuel
        • 6 years ago

        I’ll tell you how: the stopgap is a waste of money. I prefer open standards.

        [quote<]I have no idea how anyone can slam nvidia[/quote<]

    • ronch
    • 6 years ago

    So who else here thinks Nvidia and AMD will never agree on anything?

      • Chrispy_
      • 6 years ago

      They’re bitter enemies; They will both always agree that the other is wrong.

      • chuckula
      • 6 years ago

      They both can’t stand Intel!

        • Klimax
        • 6 years ago

        Not until NVidia gets 14nm chips out of Intel’s fabs…

      • NeoForever
      • 6 years ago

      Only when the govt. unites them into one company so that there can be a monopoly 😛

    • Tristan
    • 6 years ago

    NV are stupid liars !!!
    Current DP 1.2 do not support dyamic VBLANK as all. They are using DP AUX channel to transfer these VBLANK commands to g-sync, in non-standarized way. So, for such non-standard solutions, current DP 1.2 is enough, and lies look like true.
    Let these NV idiots encode these VBLANK commands into rendered images, and spread their stupid lies to the word, that d-sub is enough for everyting !!!

      • willmore
      • 6 years ago

      Hmm, not sure if insightful ravings or demented ravings……

        • Tristan
        • 6 years ago

        If you do not understand simple statements, do not respond.

          • l33t-g4m3r
          • 6 years ago

          Spellcheck might help.

            • Melvar
            • 6 years ago

            It doesn’t meter if you user a spellchecker or knot, they still spread their stupid lies to the word.

      • NeelyCam
      • 6 years ago

      Fanboi rage?

    • ronch
    • 6 years ago

    Nvidia wants you to pay for it. AMD thinks they can give it to you for free the same way they practically want to give everything for free. No wonder they’re not earning a lot.

    /sarcasm

      • shank15217
      • 6 years ago

      Sorry, but CUDA has been a curse on GPU programming since 2008; we don’t need another CUDA.

        • erwendigo
        • 6 years ago

        Yes, the “curse” that was the MODEL for the development of OpenCL and, to a lesser extent, DirectCompute. It’s the father of all these alternatives.

          • Scrotos
          • 6 years ago

          Well, CUDA 1.0 was released Monday 25th June 2007 and Brook was demonstrated at SIGGRAPH in 2004 ( [url<]http://graphics.stanford.edu/papers/brookgpu/[/url<] ) so you could argue that CUDA used Brook as a model itself and if CUDA is the father, BrookGPU is the grandfather. Unless you're going to try and make the case that Cg invented everything ever forever as far as "programming" on the GPU.

    • Pantsu
    • 6 years ago

    While I don’t agree with Nvidia’s attempts at making things like this proprietary and splitting the market, I do applaud them for innovating and actually putting in the effort to bring the feature to consumers. It’s why it’s good to have competition; both sides come up with things that the other side didn’t think of. What’s bad for the consumer is that they don’t like to share their ideas and make open standards of them. If they were willing to work together on ideas and share the development cost it would be much better for the consumer, but I suppose that’s wishful thinking.

    So after all FreeSync isn’t inferior like some detractors were quick to claim without any proof. All it needs is a new supporting ASIC for the monitors. According to AMD it shouldn’t be much more expensive than what the monitor manufacturers already do, so it’s just a matter of them making one. Obviously so far they haven’t seen the need, but if AMD exposes the feature, it should be an easy thing to implement and check one more feature box with little cost attached.

    Nvidia obviously has other reasons to keep G-Sync since they also do the 3DVision thing and strobing backlights tech, so it makes sense for them to develop the ASIC and sell it to the monitor manufacturer and sell Nvidia branded monitors for a little extra – after all Nvidia fans never seem to mind the extra cost.

      • Reuel
      • 6 years ago

      That’s not what they did. nVidia took an upcoming standard and made their own proprietary version of it in an attempt to be first to market, which, let’s face it, isn’t hard to do given the molasses-like speed of standards committees.

      [quote<]I do applaud them for innovating[/quote<]

    • cygnus1
    • 6 years ago

    Let me see if I get what’s happened here. VESA was doing their thing a few years ago and created a free, better version of HDMI, aka displayport. Laptop people said, ‘sweet, a free display interface, let’s get that extended a bit for our use’. eDP is born, [url<]http://en.wikipedia.org/wiki/DisplayPort#eDP,[/url<] and it includes the power saving feature for "seamless refresh rate switching". AMD and nVidia support it in silicon because it's in the standard and they're unifying their mobile and desktop silicon. Normal desktop monitors don't support it since it's an embedded interface, so nobody notices for years.

    VESA keeps doing their thing with regular old displayport, keeps making it better, and starts planning v1.3 a while ago, which is now due out this year. nVidia sees this coming, since they're a member of VESA. Somebody at nVidia decides, let's pitch this old idea that's about to come to desktops anyway as our own before everybody else supports it because it's the freaking standard. Also, rebrand it so it looks like nVidia came up with it too. (copied that play from Apple I think)

    Basically all nVidia has done is produce the silicon for a monitor to support a feature of displayport 1.3 before it's a ratified standard and before anybody else. Or hell, maybe gsync is just eDP, who knows, since nVidia doesn't talk about their 'secret sauce' like AMD does when the secret sauce is really an open standard. I will bet that in less than 18 months crap loads of monitors will support this.

      • Hattig
      • 6 years ago

      They haven’t even produced the silicon, it’s currently still at the FPGA stage! Which is one reason it costs so much.

      I’m sure all the traditional scaler chip manufacturers will have DP 1.3-supporting chips out in real silicon this year, and hence FreeSync will be available everywhere.

      I have no issue with Nvidia deciding to get into the monitor scaler market, but to tie their scaler in to their computer GPUs is just a bit shitty. You don’t need a Sony TV to use a PS4, nor a TV with Microsoft certification to use an XBox One, so I disagree with the article’s assertion about platforms.

      Thankfully because Nvidia decided to overengineer and rebrand technology that is coming in a standard (and has existed for a while anyway), monitor makers will soon ignore the proprietary solution that limits the market, and use the standard solution that works everywhere.

        • Reuel
        • 6 years ago

        FPGAs aren’t made out of silicon? Wow! What new material did they discover?

          • modulusshift
          • 6 years ago

          😛 of course, he means that they haven’t even produced a prefab chip. Not like they produced the FPGAs anyway, they bought them from someone else and programmed them.

            • Reuel
            • 6 years ago

            It was a funny joke, I’m not sure why people hated it. =P

            Yes, usually when somebody says they “made” a FPGA, they mean programming it.

      • psuedonymous
      • 6 years ago

      Nah, that Satellite was using an LVDS panel rather than eDP. AMD were driving it directly over the LVDS interface rather than passing it DP and letting the monitor controller handle the panel driving.

      • Aerugo
      • 6 years ago
        • cygnus1
        • 6 years ago

        I’ve held off as well. This is a feature that I really want and learning more of the background behind its development is starting to worry me. I’ve gone back and forth between GeForces and Radeons over the years and it just so happens that my current gaming PC is running nVidia SLI (thanks to an amazon double shipping error). As a current nVidia user, since they’ve implied g-sync is proprietary and they don’t want to share it, I’m now worried they’re not going to support the standard and stick to only working with monitors blessed by them.

        The way I see this feature going is like this: AMD and Intel will support the VESA standard in DP 1.3, so monitor manufacturers will have to choose between supporting only the larger Intel/AMD user base for $$, or nVidia and/or both for $$$$.

        Which is where my worry comes in: it looks like getting this feature from nVidia is going to carry a premium that other vendors’ solutions won’t, or at least not one as large.

          • clone
          • 6 years ago

          the introduction of G-sync and the announcement of Free Sync has gotten me really excited about buying a new display….. far more than 2k and 4k have.

          I was casually thinking of buying a 2k this year but that’s on hold indefinitely until Free sync or something like it gets to market.

          I won’t support Nvidia’s G-sync because it’s proprietary but I’d jump on free sync in a heartbeat.

          • sweatshopking
          • 6 years ago

          did you attempt to send the extra card back? or did you just keep it?

            • cygnus1
            • 6 years ago

            They said keep it if it shows up, which it did

            • sweatshopking
            • 6 years ago

            that’s a heck of a deal!

            • cygnus1
            • 6 years ago

            I thought so too. Good enough payment for the original box not showing up for several days after it was supposed to

          • Aerugo
          • 6 years ago
        • pandemonium
        • 6 years ago

        I was somewhat discouraged when the news broke about Gsync after I had just purchased a new monitor, but then thought to myself, no regret for several reasons.

        -The Gsync/freesync will be available as an add-on that will be compatible with most monitors – if not all – eventually, so if I wanted to update my monitor I could do so myself (I have a soldering iron and not afraid to get my hands on it).
        -My monitor died and I needed to update to something better than my cheapy backup I was using anyways.
        -I’d rather not spend the money on the premium if I didn’t have to from Nvidia and end up waiting and waiting anyways.

        My thoughts when I hear about new tech that’s coming out: [b<]don't wait[/b<]. Every time I’ve waited, I was led to disappointment and realized I should’ve just gone ahead with the purchase anyway.

      • green
      • 6 years ago

      from my understanding the general difference will be availability

      g-sync is a hardware “mod” that can be added on to “very limited range” of existing monitors
      so if you forked out $500 on a nice compatible monitor, an all up cost of $100 for a g-sync module + labor isn’t that bad of a deal to get very smoothed out frame rates
      i would assume there’s a small list of monitors already upgradable
      i wouldn’t expect that list to get very large / broad though

      meanwhile free-sync capability will be part of displayport v1.3.
      there is not a monitor you can “upgrade” to use the newer version without someone somewhere outright buying or wrecking, what at the time will be, an existing displayport v1.3 monitor

      otherwise, yes. i don’t get why someone would outright buy a “g-sync monitor”.
      that is of course unless some future update could mean free hardware level double buffering + variable refresh
      that way all the video card “ever” need do is send the next complete frame up and let hardware handle when to switch

    • TwoEars
    • 6 years ago

    So many people in here that want something for nothing.

    I’m glad nvidia took this step, I understand and believe in the technology.

    Perhaps it isn’t what we’ll be using 5-10 years from now.

    But I’ll be more than happy to buy a g-sync monitor right now.

      • renz496
      • 6 years ago

      Well, no pain no gain. Even Nvidia’s Tom said they will not be doing others’ job for them; at the very least Nvidia is honest about that. Since AMD seems confident this can be done with no extra hardware (G-Sync module), I hope they will show FreeSync running games soon. When Nvidia came up with G-Sync, they ran real games to convince people.

      • Sabresiberian
      • 6 years ago

      Yah regardless of whether or not you think Nvidia is trying to sell unnecessary hardware, there is a benefit to be had that we wouldn’t know about without Nvidia bringing it to light. AMD claims to have known about it all along – but did nothing with it.

      Seems to me that the one portrayed as greedy here is doing me more benefit than the one portrayed as “open and free”.

        • Reuel
        • 6 years ago

        AMD can’t do anything until DisplayPort supports the feature in version 1.3. eDP already supports it, which is how they were able to do this demo.

        [quote<]AMD claims to have known about it all along - but did nothing with it.[/quote<]

    • Spunjji
    • 6 years ago

    Brilliant! Could be summed up as “nVidia exec admits that G-Sync is over-engineered”.

      • Klimax
      • 6 years ago

      Wrong. Aka selective reading and wrong conclusion reached.

    • windwalker
    • 6 years ago

    Interesting.
    Are there any laptops I can buy today that support free sync?
    And what does Intel have to say about all of this?

    • Helmore
    • 6 years ago

    Now get me a 27″ 2560×1440 or 2560×1600 IPS display with a 120/144 Hz max refresh rate and FreeSync. I’d prefer it if it doesn’t cost me an arm and a leg, either.

      • psuedonymous
      • 6 years ago

      Look out for Overlord’s offering. They sell those overclockable Korean 2560×1440 IPS monitors, and have stated that they are working with Nvidia on G-sync: [url<]http://overlordforum.com/topic/603-nvidia-g-sync/[/url<]

        • Helmore
        • 6 years ago

        One problem with that. I’m currently on an ATI GPU. My previous one was NVIDIA and who knows what the future will bring, but I would still prefer some vendor agnostic solution. Something akin to FreeSync in other words.

          • brucethemoose
          • 6 years ago

          Lightboost already works on AMD GPUs, so there might be a G-sync AMD hack in the near future.

            • superjawes
            • 6 years ago

            G-Sync is going to use common, existing protocols between the GPU and monitor. “Hacking” that is easy since they just need to know how to activate it.

    • puppetworx
    • 6 years ago

    [url=http://www.pcper.com/reviews/Graphics-Cards/AMD-Variable-Refresh-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync<]According to AMD[/url<] the upcoming DisplayPort 1.3 standard includes variable refresh rates like eDP currently does. Apparently some monitors might already support it. The 60% price premium on a locked-into-NVIDIA G-Sync monitor is looking like a much tougher sell at this point.

      • psuedonymous
      • 6 years ago

      The monitors may support accepting varying VBLANK over DP. What they will actually DO with that is likely buffer the frames, display them at a fixed update rate, and drop any extras. To do otherwise would mean modifying their driver ASICs to do asynchronous updating, and at that point they may as well pay Nvidia for the G-sync controller, as that is exactly what the G-sync controller does.

        • Reuel
        • 6 years ago

        The refresh rate is changed to fit the speed of the GPU on the fly. They don’t need to pay nVidia for that because VESA put it in the standard before nVidia stole it and nVidia doesn’t hold the patent on it anyway.

        • sschaem
        • 6 years ago

        And that controller is built into AMD’s GPU ASICs.

        And clearly AMD’s mobile ASICs handle a non-fixed rate, as this tech demo proves.

        This all works for mobile, and AMD already has a working implementation; the issue is having the embedded DP spec carry over to desktop monitors.

        But for mobile Kaveri, OEMs don’t have to license anything, just follow the eDP spec like this existing Toshiba laptop does. It’s natural and preferable to have the GPU, not a separate ASIC, drive a variable-refresh display.

        Nvidia just got tired of OEMs doing nothing and decided to take command of the situation.
        We need more moves like this from Nvidia, and hopefully AMD, to wake up the PC industry… Innovate or die.

      • superjawes
      • 6 years ago

      The big premium on G-Sync right now is the FPGA, which is an expensive component. Finishing the design and manufacturing the final silicon will reduce that premium by a significant margin.

      But at that point it will be very hard for Nvidia to vendor lock the technology to their own cards. If a monitor has the functionality, anyone should be able to tap into it since the cables between the GPU and monitor are not changing. Those were developed by different parties (VESA for DP and HDMI Founders).

      So what upsets me is that Nvidia sounds like they’re going to artificially limit monitor adoption to squeeze more profit, although I’m pretty sure they could increase revenue by selling in higher volumes…oh well…

        • NeoForever
        • 6 years ago

        [quote<]So what upsets me is that Nvidia sounds like they're going to artificially limit monitor adoption to squeeze more profit[/quote<] With current consumers, that's probably the right strategy to make $$. Just wait until the G-Sync Gold monitors come out. Only 10 units ever produced!

      • kilkennycat
      • 6 years ago

      60% price premium? Huh, where did you get that figure?

      Likely added manufacturing cost for EMBEDDED G-sync:

      ASIC implementation of Altera FPGA $10 (generous).
      RAM and other electronic hardware $20
      nVidia license per unit $10 (est., probably max, since mass embedding of the technology is to nVidia’s advantage )

      Remember that the G-sync implementation will replace some or all of the original scaler functionality, so there will be an unknown-$$ cost-saving.

      Anyway, assume $40 max addition to monitor manufacturing cost. Any further markup will be at the monitor-manufacturer’s discretion, NOT nVidia’s.

        • yammerpickle2
        • 6 years ago

        Let’s not forget buying a new graphics card from Nvidia to have the feature. I’ve been running team green in my last two rigs, but I still see this as kind of a Rambus-type move, where you work on a standard, patent the tech the group is working on first, and milk it for profit.

          • kilkennycat
          • 6 years ago

          Well, no worse than having to buy an AMD card to have Mantle, or having to buy an nVidia card to have PhysX or CUDA or nV3D, or an AMD card to have Eyefinity…. Need I go on?

          I’m sure that nVidia would only be delighted to license the G-sync technology to AMD, if the price is right, just like Apple vs Samsung (and others) vs Microsoft vs… in the cell-phone/tablet markets. Don’t see much being given away free in that area these days.

          • euler007
          • 6 years ago

          If you have a GTX 650 or higher you don’t need to.

          The only people losing big are people with a GTX 580/570 in SLI who could realistically play at 1440p on the G-Sync monitors.

      • dragosmp
      • 6 years ago

      AMD conveniently forgot to say that variable refresh rate is optional in DP 1.3.

      No point in saying it now, but if Nvidia hadn’t done this whole Gsync publicity stunt:
      – we’d never know about Vblank
      – AMD wouldn’t be drumming about having had the capability to do variable refresh rates for 3 years (and hiding it for 3 years because they couldn’t convince any monitor manufacturers to use it)
      – since the Vblank feature is optional in DP 1.3, monitor vendors wouldn’t have implemented it (remember DX 10.1). They will implement it now because they’d look like fools if they don’t.

      OK, technically Nvidia could have done it differently: they could have put out a new GPU and drummed on about adjustable Vblank. That would have pushed the display industry forward with no benefit to them, only for the gamer. They played their cards in a way that earns them a few bucks, brand recognition, and lots of publicity in a way AMD couldn’t accomplish (unfortunately) for the last 3 years.

    • Jigar
    • 6 years ago

    In short AMD is on the right path, Great!!!

      • erwendigo
      • 6 years ago

      Mmm… no. You NEED a special monitor with a panel compatible with this technology (variable Vblank), and this eliminates almost 100% of desktop monitors. This runs with some laptop panels, so you can play with FreeSync… on a laptop.

      This is better than nothing, but you need a “new” monitor for your desktop, or updated firmware for the same model. In practice that means a new monitor, because at home you normally won’t upgrade the monitor/scaler firmware.

      So you’ll have to go to the shop, just as with the Gsync technology. The “best” part is that a FreeSync implementation in “new” monitors will very probably be cheaper than the more expensive Gsync implementation (with the nvidia scaler).

        • ET3D
        • 6 years ago

        From what I’ve read with g-sync you also need a monitor specially designed for it. Difference is that it’s using a proprietary technology that works with a limited subset of hardware, instead of an open standard. I can’t see where g-sync has benefits.

    • renz496
    • 6 years ago

    Still, I want to see AMD demo FreeSync using real games. When can we expect it?

    • wof
    • 6 years ago

    So it works on laptops because the interface is different but the required interface is already there in DisplayPort so G-Sync can work on existing interfaces ??!?! Doesn’t make sense to me.

    As for the part stating that existing scaler ASICs won’t work: I think it would be fine if this only worked at native resolution, and that shouldn’t be much of a problem to support.

    G-Sync sounds more and more like a lock-in scheme to help nVidia sell control logic to display manufacturers, which is a new market for them, I guess.

    I still appreciate that they’ve brought this to everyone’s attention though.

      • psuedonymous
      • 6 years ago

      [quote<]So it works on laptops because the interface is different but the required interface is already there in DisplayPort so G-Sync can work on existing interfaces ??!?! Doesn't make sense to me.[/quote<]

      Let me break it down:

      Desktop path: image in GPU framebuffer - DVI/DP interface chip - DVI/DP cable - monitor panel controller (G-sync controller here) - panel interface (LVDS or MIPI DSI) - image displayed

      Laptop path: image in GPU framebuffer - panel interface (LVDS or MIPI DSI) - image displayed

      THAT is why AMD demonstrated freesync on a laptop, rather than a desktop with a separate monitor. LVDS and MIPI DSI are not long-distance protocols, you cannot run them more than a few cm.

        • willmore
        • 6 years ago

        Let me clarify some things for you:

        Desktop path: image in GPU framebuffer – DP interface chip – DP cable – DDM monitor – image displayed

        Also, the variable refresh that is in eDP is headed for DP 1.3, which is slated for release in 1-3 months. nVidia is a member of that group. So, no, nVidia didn’t *need* to come up with a proprietary way to do freesync; they could have just pushed ahead with DP 1.3 and helped the whole community, not just their shareholders.

        I’m getting sick of the “We did a real dick move, but it’s okay, because we’re a for-profit company, so it’s cool, right?” No, you’re still a dick. Maybe your stockholders will love you and forgive you–as will some of your Stockholm Syndrome suffering customers–but that doesn’t make you a good guy.

        To be clear, nVidia did *not* have to do it the way they did, they could have gone the more universally beneficial way and they chose not to for no reason other than making money at the expense of others.

    • Maff
    • 6 years ago

    It’s nice to hear someone from the opposite camp admitting that FreeSync basically is the same thing.
    Let’s hope monitor makers will start to support FreeSync (or announce that they already do), since both camps should be able to use it on any monitor supporting it.

      • Klimax
      • 6 years ago

      I wonder how locked-down code in G-Sync module is…

    • Arclight
    • 6 years ago

    But he had no comment as to whether the result would be different? As in, he silently admitted that both approaches would have the same result, without one being superior to the other?

    Hmmm

    Personally I’m not bothered enough by tearing to invest in the tech, but a lot of people are able and want to, so I guess the tech is here to stay. I was actually more interested in faster response times and the betterment of IPS panels in terms of refresh rate, plus higher resolutions on normal-sized panels.

      • Laykun
      • 6 years ago

      The result is not possible at all, since no desktop scaler ASIC supports variable refresh rates. Hence G-Sync.

      “To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, “we would know.” Nvidia’s intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.”

      For free sync to work it would require a new technology on desktop monitors.
