Modded G-Sync monitors available in limited quantities

Nvidia’s G-Sync technology is pretty neat; it discards the fixed refresh rate of traditional displays and instead updates the screen as new frames are produced by the graphics card. This synchronization eliminates tearing and stuttering, and Scott was very impressed with the results. There is a catch, however. G-Sync requires a Kepler-class GPU and a monitor infused with custom Nvidia hardware.
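If you're curious why a fixed refresh rate clashes with a variable frame source, here's a rough simulation of the idea. This is our own illustrative Python sketch, not anything from Nvidia's stack: it models a GPU delivering frames at uneven intervals, counts how many buffer flips land mid-scanout on a fixed 60Hz display (a visible tear), and notes that a display which simply starts a refresh when each frame completes never tears by construction.

    # Illustrative sketch only: why fixed-rate scanout tears with variable
    # frame times, and why refreshing on frame completion doesn't.
    import random

    random.seed(42)
    REFRESH_MS = 1000.0 / 60        # fixed 60Hz display period
    ACTIVE_MS = 0.95 * REFRESH_MS   # flips during active scanout tear;
                                    # only the short vblank window is safe

    # A GPU rendering at a variable ~35-55 FPS for ten simulated seconds
    frame_done, t = [], 0.0
    while t < 10_000.0:
        t += random.uniform(18.0, 28.0)   # per-frame render time, ms
        frame_done.append(t)

    # Fixed refresh, vsync off: a flip landing inside active scanout = tear
    tears = sum(1 for ft in frame_done if (ft % REFRESH_MS) < ACTIVE_MS)

    # Variable refresh (the G-Sync idea): scanout begins when a frame is
    # ready, so every frame is drawn whole -- zero tears by construction.
    print(f"{len(frame_done)} frames rendered")
    print(f"fixed 60Hz, no vsync: ~{tears} visibly torn")
    print("refresh-on-completion: 0 torn")

(With vsync on instead, flips wait for the vblank, which trades tearing for the stutter and input lag that G-Sync also avoids.)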

The first display to support G-Sync is Asus’ VG248QE. A native G-Sync version is due next year, and modified variants of the existing model are making the rounds at tech review sites. Scott has been playing with one in Damage Labs, in fact. And now you have a chance to buy one for yourself. Asus and Nvidia are selling a limited number of modified VG248QE monitors through a select group of system builders.

Digital Storm, Falcon Northwest, Maingear, and Overlord Computer will all be offering the G-Sync-enabled display. The Maingear and Overlord links are dead as I write this, but we do have details on the other vendors. Digital Storm is selling pre-modified monitors for $499, and it’ll upgrade an existing VG248QE display for $299. Falcon Northwest is also selling modded displays. However, they’re only available to folks buying complete systems or those who have purchased Falcon rigs previously.

We were told in October that the native G-Sync version of the VG248QE would sell for $399, so the modded screen is priced at a premium. Falcon notes that part of the additional cost can be attributed to the labor associated with the modification process. The G-Sync module also replaces some of the monitor’s existing electronics, which won’t be necessary in the native version.

Unfortunately, the G-Sync kit isn’t sold separately right now. Stay tuned for an opportunity to win one for yourself, though. Scott will also have more to say about the technology soon.

In the meantime, you can check out a new promo video highlighting G-Sync’s benefits. The refresh-matching tech can only be simulated on a conventional display, so there are some caveats attached to the clip. The video has a 60-FPS frame rate; to get the full effect, you’ll need a 60Hz monitor and a video player capable of maintaining that frame rate. You’ll also want to download the 200MB file and play it locally rather than attempting to stream the footage. We’ve tried watching the video on several systems, and we’re still seeing some occasional skipping, but the clip still gets the point across.

Comments closed
    • itachi
    • 6 years ago

    I do hope that when I have the money they will have Ultra HD versions, non-TN (tired of the bad viewing angles on mine), low latency/no ghosting, and 144Hz possibly? Lol :D. I should stop dreaming… Plus I doubt that there will be single GPUs good enough to spit out enough FPS to make 144Hz worthwhile at this resolution! (cause I don’t want to go with dual-GPU solutions)

      • Airmantharp
      • 6 years ago

      Turn down the settings 🙂

    • Sahrin
    • 6 years ago

    Holy shit…$300? I was afraid it would cost $100. Good God nVidia. Alright AMD, fix this.

    • Krogoth
    • 6 years ago

    I cannot believe the excitement over a band-aid. A decent CRT paired with a video card using a quality RAMDAC design never needed G-Sync; however, CRTs had their own set of issues.

    We wouldn’t have needed “G-Sync” and similar solutions if SEDs/FEDs hadn’t been killed off by patent trolls back in the mid-2000s.

    Hurrah for another glorious victory of the patent office! Promoting innovation and growth!

      • Airmantharp
      • 6 years ago

      A CRT would still show tearing without V-Sync/G-Sync, and would still stutter and suffer an input lag penalty with V-Sync.

      SEDs/FEDs would be thick, heavy, and suck power.

      The hell’s wrong with you lately?

        • Krogoth
        • 6 years ago

        Have you even looked into what SEDs/FEDs offer over LCDs and CRTs?

        They combine the best of both worlds, but the tech got locked up by patent trolls. Sony and Canon lost interest due to the legal fees and proceeded to go with more profitable LCD panels.

          • Airmantharp
          • 6 years ago

          Yes, I remember.

          They’d have infinite color gamut and be as fast as a CRT but with perfect geometry and thin like a paneled display (LCD/OLED/Plasma). They’d also be heavy, and be bigger power hogs than CRTs.

          And they’d still show tearing/stuttering if they used the same transmission principles. They’d still benefit from G-Sync.

          So how the hell is that discussion related at all?

            • Krogoth
            • 6 years ago

            Ahem, no. SED/FEDs are lighter and easier on power than CRTs. They still lose to OLEDs and LCDs though. The crowd that cares about color accuracy and refresh speed wouldn’t mind though.

      • UberGerbil
      • 6 years ago

      Scott has seen this in action and says it makes a significant difference. You haven’t but say that it doesn’t. Who am I going to believe?

        • Krogoth
        • 6 years ago

        Band-aids also help a lot when you are trying to heal a large laceration.

        LCDs aren’t designed for speed. They will always be slower than CRTs at refreshing their screens. LCDs took over because they offer a number of other advantages. They have a much smaller profile, consume less power, produce far less e-waste, and have perfect screen geometry.

        I’d rather wait for better display technology to come around than waste time/money on band-aids that will become obsolete once a superior solution arrives.

          • Airmantharp
          • 6 years ago

          So you’d rather have tearing or stuttering than not have either?

          You’re not making much sense. It isn’t a ‘Band-aid’, it’s a re-thinking of the process, and it isn’t limited to LCD technology; we’ll want it on OLEDs too.

            • Krogoth
            • 6 years ago

            Re-thinking? Not really.

            “G-Sync” has been done before, years ago. Nobody cared enough about it to take it beyond R&D circles. The only reason Nvidia picked up on it is because there’s enough noise about input lag and syncing issues among the videophile world. They see it as another marketing pitch to keep FPS junkies fixed on their brand.

            This could have been avoided if the VESA forum had made it part of the DisplayPort spec. However, most of the world doesn’t care enough about syncing/tearing issues to make it worth the trouble.

            • Airmantharp
            • 6 years ago

            There has to be a return on investment, but I’m sure you know that. Thing is, it’s good for everything, not just FPS. It is the way it should have been from the beginning.

          • Deanjo
          • 6 years ago

          CRTs had to have a high refresh rate to offset the fading of the phosphors in between scans. With LCDs you do not have that issue, as the pixel stays in the same state and brightness until the next scan. So while CRTs were capable of higher refresh rates, the speed was a necessity to reduce phosphor flicker, something LCDs never had to worry about. This is one of the reasons why reading something on an LCD even at a low refresh rate of 24Hz (or even lower if the image is static and there is no movement present) is easier than reading something on a CRT @ 85Hz.

            • Airmantharp
            • 6 years ago

            85Hz was my minimum; anything less and I could see the flicker easily across the room. >100Hz was nicer, though that limited resolution.

            • Krogoth
            • 6 years ago

            CRTs have to constantly refresh the image on their phosphors, whereas LCD panels only shift their crystals when needed. LCDs do have their own problem, though: image persistence. That’s when the crystals get too “used” to staying in the same position (similar to image burn-in on plasmas). It only happens if you leave a static image up for several days or more. You can usually clear it up by forcing the LCD panel to shift its crystals around rapidly (there are several online tests that do this).

            LCD panels do flicker, but the cause is different. It is usually because the backlight or its electrical circuitry is starting to fail (blown caps). It normally happens on aging units, and you can repair it with a steady hand and a soldering iron (if needed).

      • derFunkenstein
      • 6 years ago

      The perpetually unimpressed: [url<]https://blooki.st/BlookElement/ShowTextPhoto?blookElementId=1425[/url<]

      • JumpingJack
      • 6 years ago

      [quote<]I cannot believe the excitement over a band-aid. A decent CRT paired with a video card using a quality RAMDAC design never need G-Sync, however CRTs had their own set of issues.[/quote<] This is not true; even CRTs showed tearing. This is a hard, non-negotiable fact of driving a fixed-refresh display with a variable-frame-rate video source. CRT, LCD, OLED, it does not matter ... tearing is a fact of life in this situation. The G-Sync method is the most logical approach to avoiding the worst of both situations. EDIT: The G-Sync method, ironically, would not work with a CRT, so LCD actually enables this approach.

    • mcnabney
    • 6 years ago

    For multi-display setups, do all of the displays have to have G-sync for it to work?

      • Deanjo
      • 6 years ago

      If you want to span the picture across them, the answer would be yes.

    • Milo Burke
    • 6 years ago

    Did Nvidia patent this objective? Or merely trademark the name G-Sync?

    Because perhaps AMD can’t have G-sync by name, but they could build their own version under their own name? That could utilize the same monitors? Just like we have SLI and Crossfire.

    Can anyone shed light on this?

      • superjawes
      • 6 years ago

      My guess is that Nvidia is going to ride the exclusivity wave as long as they can, which is only going to last while this technology is in development. Theoretically, AMD could design a way to tap into this functionality now, but the fact that the G-Sync “chip” is an FPGA (meaning it can be reprogrammed) suggests that not all of the requirements are nailed down yet.

      Instead, Nvidia can use that exclusivity to fund development, work out the details, and design the ASICs that will eventually replace what is currently in most monitors. At that point, it’s just a matter of tapping that functionality. The display won’t care what is driving it. As long as it can receive a signal to go into “G-Sync mode,” the GPU (or other device) will be able to control the refresh rate on the fly. That means that AMD, Intel, Sony, Samsung, etc. can all make use of the functionality.

      • Chrispy_
      • 6 years ago

      Fixed vertical synchronisation is as old as the hills, but “sync” implies that “async” is obvious and therefore un-trademarkable and un-copyrightable.

      AMD, Intel et al. will no doubt come up with their own JIT frame-delivery system that doesn’t have the G-Sync branding.

        • JumpingJack
        • 6 years ago

        This ….

        Though Nvidia may patent, and make proprietary, the variable-refresh module that replaces the scaler, the concept of a variable refresh rate dictated by the video source is not patentable; it is a concept. Nothing would stop AMD/Intel from developing their own technology or forming a standard, in which case Nvidia would need to support it as well in order to avoid shutting themselves out of a new market.

        An example of this was Intel’s proprietary wireless display technology: while it had its applications, it never really caught on. However, a standard developed, Miracast, which works with Android devices among others, and Intel now supports it (they had no choice).

    • Bensam123
    • 6 years ago

    This is awesome if kits become available for people who already own these monitors! I already own one, and the idea of rebuying the same monitor you could just swap a module into is rather vomit-inducing (especially considering the price I paid for it). I hope these prices aren’t final. I’d definitely buy the module, but not for the price I paid for my damn monitor.

    I suppose there is the little snag of owning a 7870 and Nvidia being haughty-tauts and keeping this tech limited to Nvidia. If Nvidia isn’t careful, the AMD version I’m sure is in the works will pop out and instantly obsolete this, as I’m sure it’ll be open to both manufacturers.

    • Freon
    • 6 years ago

    I saw the slow-motion three-way comparison Anandtech put up (vsync off, vsync on, G-Sync), and with all the reports of how great it is, I’m generally sold on the idea. But the price, availability, and exclusivity absolutely kill it. It’s still way short of what I would want.

    I’m certainly not downgrading to a 1080p TN for this. I only paid ~$335 for my 27″ 1440p Korean PLS that runs at a higher refresh rate to start with (it seems good at 90 or 100Hz, which is a not-half-bad temporary fix with vsync both on and off).

    I’m very hopeful, but still not sold. Especially considering how much I generally pay for my monitors ($500+ for several Dells in the past 10 years) so I try to keep them a long time. This “cheap” PLS I just got is the odd one out, but makes me rethink spending big money on a monitor.

    • alienstorexxx
    • 6 years ago

    i still think that this technology has something missing for it to be better received.

    it needs to be an improvement in other respects, replacing something old that’s “bottlenecking” monitors to radically change their future, the way they’re built or something about their size/power consumption. it’s been like 5 years between the last monitor i bought and the one i’ve got now, and they both were in the same price range in their time; the only difference is the current one is (backlight) led and one inch bigger.

    i think that it needs to come in a bigger package, not just g-sync itself, but with something that can replace the current generic monitor architecture, to a higher end for the masses. all in all, this thing can’t be adapted to any current monitor, it actually needs this specific slot for it to work, and it’s only for games. so why not take a bigger step?

      • Airmantharp
      • 6 years ago

      The ‘big step’ was DisplayPort; G-Sync is a robust approach to tossing out the archaic parts of the specification and moving on to a panel-centric solution.

      Also, it’s not all about you and what you just bought (or anyone else). I bought a 30″ IPS a few years back, and while I’d love to retrofit a G-Sync module into it, I’m quite happy with what I’ve got, tearing and all.

        • alienstorexxx
        • 6 years ago

        hi snoopy, how are you?

      • superjawes
      • 6 years ago

      The monitor is the bottleneck, though, and G-Sync solves said bottleneck. Monitors just refresh 60 times per second (or more), and mismatching that refresh rate can mess up animation.

      And the technology really is in its infancy, and that point cannot be stressed enough. It is very limited right now because, I assume, Nvidia still has to figure out all of the protocol details to activate the G-Sync functionality.

      After a couple years, though, once all of those protocols are figured out and the ASICs are designed, the benefit can apply to ALL monitors and video devices. Yes, games will get a benefit, but movies are shot at 24 FPS, which doesn’t divide evenly into 60 Hz, a common refresh rate for monitors. If a DVD or Blu-Ray player can tap into G-Sync, that doesn’t matter anymore because the display can adjust to the optimal refresh rate of the source material (24 or 48 Hz, depending on the minimum threshold).
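      (A quick worked example of that cadence arithmetic, as a hypothetical Python sketch rather than anything from the G-Sync kit: 60/24 = 2.5, so a fixed display must hold alternating film frames for 2 and 3 refreshes, the familiar 3:2 pulldown judder, while a 48Hz refresh holds every frame for the same time.)

          # Hypothetical sketch of the 24-into-60 cadence problem (3:2 pulldown)
          FILM_FPS, FIXED_HZ, VRR_HZ = 24, 60, 48

          # Refreshes spent on each film frame at a fixed 60Hz: 60/24 = 2.5,
          # so frames alternate between 2 and 3 refreshes on screen.
          cadence = [((i + 1) * FIXED_HZ) // FILM_FPS - (i * FIXED_HZ) // FILM_FPS
                     for i in range(6)]
          print("refreshes per frame at 60Hz:", cadence)   # [2, 3, 2, 3, 2, 3]
          print("on-screen ms:", [round(r * 1000 / FIXED_HZ, 1) for r in cadence])
                                                           # [33.3, 50.0, ...] -> judder

          # A display refreshing at 48Hz holds every frame for exactly 2 refreshes.
          print("on-screen ms at 48Hz:", round(2 * 1000 / VRR_HZ, 2))  # 41.67, every frame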

      And yes, once anyone can tap the functionality, AMD can design drivers to utilize it as well. It might seem pretty locked down right now, but even if it was open source, with Nvidia working out the hardware now, AMD would be trying to hit a moving target.

    • slowriot
    • 6 years ago

    G-Sync looks very interesting. However, I’m not “excited” for it because right now it appears it would require at least a ~$800-$900 investment for me.

    There’s a lot of interesting display technologies floating around right now. I’m very hesitant to make an investment in a good monitor with this in mind. $400 isn’t an amount I want to spend on a TN and 1080P only monitor. But I don’t want to buy a 2560×1440 IPS monitor either because I do want higher refresh rates and G-Sync. And then there’s those 4K monitors that are showing up now. Ultimately I’m left thinking it’ll be at least another generation before I really start getting excited about G-Sync due to pricing and waiting on convergence of features.

    • UnfriendlyFire
    • 6 years ago

    I LOVE partial vendor lock-in. Once you buy one of those monitors, you have to use Nvidia’s GPUs if you want to enjoy the syncing benefits.

      • indeego
      • 6 years ago

      Kinda like, oh, buying a motherboard and being forced to buy a CPU that works with it?

        • UnfriendlyFire
        • 6 years ago

        Or, you can think of it as RDRAM, only available on certain chipsets/processors. But I guess the main difference was that RDRAM wasn’t decisively superior to the competing DDR, whereas AMD is limited to software fixes unless they’re hiding something in their lab.

      • Klimax
      • 6 years ago

      Like with any extension going past the spec and requiring adapted hardware. (Although is it vendor lock-in if there is no alternative even in development?)

      There is one question, though: is it patented, and if not, can Intel/AMD do the same VBLANK variation, with G-Sync accepting it?

    • brucethemoose
    • 6 years ago

    Is there any way to mod these onto the Korean monitors? Seeing how the G-Sync module costs more than my entire 1440p monitor, I’ll assume they will come down in price…

    Also, I wonder when a hack for AMD cards will come out.

      • Erebos
      • 6 years ago

      Given AMD’s belief in open standards, I hope they will propose their solution to VESA (ATI was one of the original founders) and kick Nvidia to the curb for being a jerk (again).

        • shaq_mobile
        • 6 years ago

        this would be tight, to be able to retrofit it to those four korean monitors i have.

        • Klimax
        • 6 years ago

        [quote<]and kick Nvidia to the curb for being a jerk (again).[/quote<] The post made sense until this part. Kick to the curb??? And how is Nvidia being a jerk? AMD could have proposed this years ago; nothing... So the price for the complacency of standards engineers and other vendors is being beaten to it by Nvidia.

    • MrJP
    • 6 years ago

    How much?!?

    I know this is a limited volume, prototype sort of thing, but eventually this needs to add no more than $50 to the cost of a monitor to have any chance of making any impact on the market. Being available on a wide range of monitors and preferably GPU vendor-agnostic wouldn’t hurt either, though I don’t see Nvidia going in that direction any time soon.

      • internetsandman
      • 6 years ago

      You’re not gonna get a company like Nvidia producing open-source hardware for the first generation of a highly impactful new technology, nor are you gonna get it at a budget price. Either of those things on its own is crazy to ask for; combined, it’s simply not happening.

        • brucethemoose
        • 6 years ago

        There was a hack for LightBoost on AMD; I bet a hack for G-Sync on AMD will come out eventually.

          • Klimax
          • 6 years ago

          Until AMD makes a new hardware revision, no hack will work, as this requires support on both sides. The drivers also need to be aware of it. In theory it might be possible to hack, but I suspect it would require massive effort.

        • indeego
        • 6 years ago

        120Hz was a premium for about the first year. Now it sells the same as 60Hz LCDs.

          • slowriot
          • 6 years ago

          Huh? I have yet to see a 120Hz monitor sell for the same price as an otherwise identical 60Hz monitor. 120Hz or higher is definitely still a premium and outside of overclocked 27″ IPS models is limited to TN.

            • auxy
            • 6 years ago

            EIZO says hello!

            [url<]http://www.eizo.de/microsite/fg2421.html[/url<]

            • travbrad
            • 6 years ago

            That monitor is $600 on Amazon, for a 23.5″ 1080p monitor. Please tell me how that is not a price premium again?

            You can get a 27″ 1440p IPS display for less than that, let alone a 23.5″ 1080p non-IPS display.

            • auxy
            • 6 years ago

            [quote=”slowriot”<]120Hz or higher (...) outside of overclocked 27" IPS models is limited to TN.[/quote<] And the overclocked IPS monitors suffer from motion blur due to slow panel response! ┐(´д`)┌

      • superjawes
      • 6 years ago

      Give it time. Eventually I suspect that the premium for G-Sync monitors will approach zero, because the chip that enables it will just replace the current chips, but we aren’t there yet.

      The current mod kits are FPGAs (not exactly cheap devices), and I suspect that is because Nvidia is still working out how the protocol actually works. After all, DVI and VGA were designed for CRTs that required a constant stream of information because they constantly had to refresh. LCDs don’t need to constantly refresh, so you don’t necessarily need to throw data at the monitor at a fixed rate, and you also need to carry the control signals that trigger scans and holds.

      And once that is worked out, they can release ASICs or establish requirements for other people to implement them, which will be much cheaper. And that will be easier for someone else to tap into.

        • Airmantharp
        • 6 years ago

        Exactly: this is the price of a customized solution, not a fully-developed solution that takes advantage of established economies of scale.

        If Nvidia doesn’t piss the whole world off in the process, we might see non-G-Sync monitors go the way of the dodo in a couple of years.

        • alienstorexxx
        • 6 years ago

        wow, what a fairy tale. you know that price is related to demand? if most people think it’s not worth the money, it doesn’t matter if the technology is good or not.
        so, best case scenario, g-sync becomes standard and costs the same as current chips. “eventually” could mean sometime in years to come, or never, if the technology doesn’t get the customer support. you can’t just expect this to happen inevitably.
        i remember back in the early 2000s when everybody used to imagine what could happen to games with ageia physx, games like gta, need for speed, etc. it was a great idea, but everyone’s thoughts were on the best-case scenario, and you can tell it didn’t make it that far.

          • Airmantharp
          • 6 years ago

          Butthurt much?

          You really can’t compare PhysX to G-Sync. G-Sync doesn’t need developer support.

          Worse, what do you think the demand for $120 modules that attach to a single TN monitor is? Don’t you think that there might be some commercial interest in bringing the manufacturing cost down to a reasonable level and raising compatibility and ease of implementation?

          Nvidia will likely be licensing the technology and branding to the same people who make controller ASICs for current monitors, once their project has generated enough interest and market demand, and then the price will plummet as economy of scale is taken advantage of and competition drives prices down.

          • superjawes
          • 6 years ago

          You obviously don’t understand. FPGA stands for “Field-Programmable Gate Array.” It basically means that you can reprogram the chip to function differently. Having a degree in Electrical Engineering and having done some digital design work, I can tell you that this technology is not ready for full production. There is still a lot of work to be done figuring out how to communicate over VGA, DVI, HDMI, or DisplayPort so that the GPU can activate this functionality. Just getting out of the FPGA stage will reduce costs [i<]dramatically.[/i<]

          In other words, Nvidia is doing a soft launch, and these monitors are essentially development and testing kits. They will serve to show off the technology and fix bugs while Nvidia completes a more permanent design.

          • Klimax
          • 6 years ago

          Look at the prices of FPGAs and then come back.
          Here’s an example of the dev boards (most likely adapted for this):
          [url<]http://www.buyaltera.com/scripts/partsearch.dll/multisearch?site=ALTERA&lang=EN&keywords=EP1AGX50[/url<]

    • tviceman
    • 6 years ago

    I’m excited about this technology, but I’m waiting for it to come to high resolution IPS panels. It might be a year or maybe more, but that’s fine; it can coincide with my 20nm GPU upgrade.

    • DarkUltra
    • 6 years ago

    Was Scott as impressed by g-sync as by 120hz :-p

    – “Wonder in amazement as the 120Hz display produces an easily observable higher fluidity in the animation.”
    techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking

      • Damage
      • 6 years ago

      More so! 144Hz peak + variability to match the GPU output rate is very nice.

        • ClickClick5
        • 6 years ago

        But what is Scott’s maximum level of amazement? Is there an amazement cap? What happens after that cap is reached?

          • indeego
          • 6 years ago

          He melds into a Krogoth.

    • DarkUltra
    • 6 years ago

    As all G-SYNC monitors include an official strobe backlight mode, better than LightBoost, I’m very excited!

    Overlord Computer looks to be working on a 1440p 120Hz IPS version:
    [url<]http://overlordforum.com/topic/603-nvidia-g-sync/[/url<] Will G-Sync be the first 1440p 120Hz over DisplayPort, I wonder...

      • brucethemoose
      • 6 years ago

      Ok, now I’m interested. A strobed-backlight 1440p IPS 120Hz G-Sync panel without driver hacks is literally the holy grail of PC displays, and since it’s Overlord, you won’t have to sell any appendages.

      • Pwnstar
      • 6 years ago

      Unfortunately, you can’t have both G-SYNC and the new LightBoost at the same time, but that just means I pick LightBoost. It really is that much better. Keep your FPS above 60 and you don’t even need G-SYNC.

        • Klimax
        • 6 years ago

        Still tearing…

          • auxy
          • 6 years ago

          Tearing at 120Hz is really not that big of a deal.

          • Pwnstar
          • 6 years ago

          A tiny bit of tearing in exchange for LightBoost is worth it.

    • Shambles
    • 6 years ago

    One step closer to my 4K OLED gsync monitor.

      • Airmantharp
      • 6 years ago

      With a REAL 600Hz max refresh rate :).

    • DPete27
    • 6 years ago

    [quote<]Stay tuned for an opportunity to win one for yourself, though.[/quote<] Do all monitors have this connector/socket to plug in the G-sync module, or just the VG248QE...or do you need a mod board to accept the G-sync card even for that monitor? (I don't often (never) pry apart my monitors to look at what's inside)

      • Dissonance
      • 6 years ago

      It’s my understanding that the mod kit is specifically for the VG248QE.

      • Klimax
      • 6 years ago

      I don’t think any current monitor is extendable. (as in “can accept replacement cards for scaler”)

    • Corion
    • 6 years ago

    I’ll be first in line for one of the new monitors. I’ve never been so excited about display tech.

      • Prestige Worldwide
      • 6 years ago

      Samesies, 144hz + gsync? Shut up and take my money!

      The question is, what do I do with my BenQ XL2420T 120hz monitor afterwards? I feel like it will be hitting the graveyard much before its time…

        • DPete27
        • 6 years ago

        [quote<]what do I do with my BenQ XL2420T 120hz monitor afterwards[/quote<] Give it to me

      • Airmantharp
      • 6 years ago

      TN?

      No thanks. But if one of you more eager gerbils gets one in the North Texas area, I’ll gladly come over and take a look, and maybe shoot some video :).

        • brucethemoose
        • 6 years ago

        And I thought I was the only computer literate person near DFW.

          • Airmantharp
          • 6 years ago

          We’re definitely rare…
