Nvidia BFGDs take low-lag G-Sync gaming to the biggest screen in the house

Nvidia's G-Sync tech powers smooth PC gaming experiences on desktop gaming monitors, and its Shield Android TV set-top box delivers gaming and streaming content to the big screen using a powerful SoC. It's only natural, then, that the company is announcing an initiative to bring PC gaming, G-Sync, big screens, and Shield-powered entertainment together in the living room.

Asus' BFGD

Called BFGDs, short for Big Format Gaming Displays, these 65" 4K screens offer HDR support and variable refresh rates up to 120 Hz. Their full-array local-dimming backlights promise 1000-nit peak brightness levels, and quantum-dot enhancement films in those backlights promise full reproduction of the DCI P3 color gamut for rich color with cutting-edge content.

Acer's BFGD

Although BFGDs have an Android TV set-top box built in for video streaming and games alike, Nvidia hammers home the point that BFGDs are ideal for PC gaming on a giant canvas, thanks to both G-Sync tech and especially low input latency—comparable to that of a G-Sync desktop monitor. The company didn't talk milliseconds, but it does claim that the average TV has over three times the input latency of PC monitors and BFGDs.

HP's BFGD

Finally, the company notes that when it's time to grab some popcorn and watch a movie on a BFGD with the built-in Shield, that device will use G-Sync to ensure that content plays back at its intended frame rate with no interpolation, be it 24 Hz, 48 Hz, 23.976 Hz for broadcast media, or practically any other source frame rate.
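
A quick sketch of why that source-rate matching matters: on a fixed-refresh panel, 24-fps film can only be squeezed into the refresh grid with an uneven "pulldown" cadence, which shows up as judder, while a variable-refresh display simply holds every frame for the same duration. The snippet below is a generic illustration of that arithmetic, not Nvidia's code; the accumulator-style scheduler is an assumption about how fixed-rate pulldown is commonly modeled.

```python
def pulldown_cadence(source_fps, panel_hz, n_frames=8):
    """How many panel refreshes each source frame occupies on a
    fixed-refresh display, using a simple accumulator scheduler."""
    per_frame = panel_hz / source_fps   # refreshes "owed" per source frame
    cadence, shown_total, acc = [], 0, 0.0
    for _ in range(n_frames):
        acc += per_frame
        shown = int(acc + 0.5) - shown_total  # round half-up
        cadence.append(shown)
        shown_total += shown
    return cadence

# 24-fps film on a fixed 60-Hz panel: uneven 3:2 cadence, i.e. judder.
print(pulldown_cadence(24, 60, 4))    # [3, 2, 3, 2]

# A panel refreshing at an exact multiple of the source rate holds every
# frame equally long, which is what G-Sync playback promises here.
print(pulldown_cadence(24, 120, 4))   # [5, 5, 5, 5]
```

With an awkward source rate like 23.976 Hz, no common fixed refresh divides evenly, so a cadence hiccup eventually slips in; refreshing the panel at the source rate sidesteps the problem entirely.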

Nvidia is working with Asus, Acer, and HP to deliver the first BFGDs later this year. We'll be visiting Asus' suite at CES this week and hope to catch a glimpse of the company's BFGD there. Stay tuned for more details.

Comments closed
    • gerryg
    • 2 years ago

    The Acer one looks like a little robot. Reminds me of one of the Star Wars TOS robots.

      • GrimDanfango
      • 2 years ago

      About time someone posted about the *important* revelation in this article.

      Just look at its adorable wittle feet!!

      (Edit: made even better because you *know* they’ll market this as a brooding “xtreme gamer” moody reds-and-blacks product :-P)

        • Voldenuit
        • 2 years ago

        Codename: [url=http://starwars.wikia.com/wiki/GNK_power_droid]Gonk[/url].

    • djayjp
    • 2 years ago

    As far as I know, the only gaming console with explicit VRR (freesync) support is the Xbox One X. No doubt the next generation of machines will have this feature as well. *But*, to be useful, Sony and Microsoft will have to require an unlocked framerate instead of the horrible vsync 60-30-60fps of most current games. Nearly guaranteed smooth gaming on console at last? Or will this just allow game makers to slack off in optimizing games so that they’ll run at an unstable framerate…?

    • EndlessWaves
    • 2 years ago

    It says something about Nvidia’s business practices that we’re assuming that the new features will be Geforce locked despite it not actually being stated anywhere.

    Can anyone actually confirm that they’re not going to allow this to be supported by other hardware such as game consoles?

      • DPete27
      • 2 years ago

      I think it goes without saying:
      GSync is locked to Nvidia GPUs because it’s proprietary tech.
      XB1/PS4 have AMD GPUs…..

        • EndlessWaves
        • 2 years ago

        Chromecast is proprietary tech too, yet google allows apps to support it on iOS and Windows rather than locking it to Android. I’m not asking whether nVidia are going to open source it and give it away for free (hah!), just whether they’ll allow manufacturers of other gaming systems to buy a licence to support it.

        Unlike on PC they’re in a minority position here, and being able to show the benefits to traditional 65″ TV buyers would substantially increase its attractiveness.

          • stefem
          • 2 years ago

          And with G-Sync that would be relatively easy, since all the work is done in the module; what’s lacking is just a license.

    • DPete27
    • 2 years ago

    I hope this crashes and burns. GSync needs to go the way of the dodo, not get pushed into further product stacks by deep pockets.

    • floodo1
    • 2 years ago

    Yes, G-Sync is more expensive, but displays featuring it generally have lower input lag than their FreeSync counterparts, and the G-Sync module allows overclocking of the LCD refresh. Compare Acer’s X34 to the XR341CK as just one example of how G-Sync is superior to FreeSync in practice.

      • DPete27
      • 2 years ago

      The panel is the panel, having a GSync module connected to it doesn’t make the panel any better.

        • stefem
        • 2 years ago

        It’s the display scaler that causes the lag, not the panel.

    • DragonDaddyBear
    • 2 years ago

    Maybe this will get TV makers to push the HDMI 2.1 spec with VRR sooner than later. Helloooooo VRR with my XBone One! (A gerbil can dream)

    • Neutronbeam
    • 2 years ago

    I’m curious if a year from now BFGD will be a BFD.

    • The Egg
    • 2 years ago

    I still don’t see why someone producing a G-Sync display couldn’t also make that same display compatible with Freesync. Yes, you’d be paying the G-Sync tax, but the display would be vendor agnostic for very little (or maybe even nothing) in addition to what you were already paying to have the G-Sync.

      • cmrcmk
      • 2 years ago

      My guess: contractual obligations.

    • derFunkenstein
    • 2 years ago

    Sigh. I hate seeing Nvidia double down on this proprietary VRR tech. I think just about everyone looks forward to the day when you can just buy a VRR display and use it with whatever GPU you want.

      • strangerguy
      • 2 years ago

      Imagine you are a TV OEM and the bean counters told you cooperating with NV will make you more money than not; why would you choose the latter?

      To anybody who is offended: I’m sure you are also the upright “for the people!” hero asking your boss to give you lower pay for the greater good of all.

        • derFunkenstein
        • 2 years ago

        well duh, I’m not blaming the TV makers for this mess. I very clearly said Nvidia.

    • NTMBK
    • 2 years ago

    “Big Format”… [i]sure[/i], that's what that stands for. Cool tech, but I wish that they would get over themselves and support FreeSync.

      • Chrispy_
      • 2 years ago

      I blame the HDTV industry for dragging its heels. The VESA VRR standard has been around for over three years now yet there aren’t any Freesync or VRR-compliant televisions on the market. LG make a 43″ [i]monitor[/i] with freesync but they haven't bothered adding the feature to any of their televisions, which is a shame, since every console except the Switch (which is just a tablet and not a console to me, anyway) has a Freesync-ready GPU on board, just itching for mainstream Freesync televisions. Nvidia will never support Freesync. I've accepted this fact now, we just need AMD to get production volumes up. Shortages of Polaris and Vega combined with their suitability for mining means that regular folk are paying a huge premium on GPUs which kind of mitigates the price advantage of Freesync :\

        • odizzido
        • 2 years ago

        HDMI 2.1 includes VRR support. Nvidia will have to really start hurting themselves to continue their proprietary march. I am kinda curious if they will just give up on HDMI or not. Seems like a bad move to me but I am biased as I hate gsync.

          • Chrispy_
          • 2 years ago

          I don’t hate G-Sync; with the exception of the cost it is functionally identical to Freesync, and at least a G-Sync screen guarantees LFC, unlike some (admittedly much cheaper) Freesync options.

          I would have thought that HDTV manufacturers would be falling over themselves to adopt adaptive refresh rates – the technology has already been worked out so they just have to add it to a TV to say that it has perfect 1:1 refresh support for all the weird refresh rates that movie buffs like to whine about like 24Hz, 23.976Hz, 25Hz, 29.97Hz etc.

            • psuedonymous
            • 2 years ago

            [quote]they just have to add it to a TV to say that it has perfect 1:1 refresh support for all the weird refresh rates that movie buffs like to whine about like 24Hz, 23.976Hz, 25Hz, 29.97Hz etc.[/quote]Any even vaguely half-decent TV for the last decade or so has already had this capability. VRR is worthless for using a TV [i]as a TV[/i], as no AV sources output VRR content, and it screws up all the perceptual processing TVs do that requires frame buffering, look-behind, and look-ahead to perform temporal operations (where varying the framerate would cause filters to behave differently). PC gaming is currently the only case where VRR has any value, so you need your TV to be sold as a PC-connected device rather than as a TV for it to add any value.

            • NoOne ButMe
            • 2 years ago

            How many PS4/XBO units are in NA/Europe?

            That’s a large market in which to get a decisive win over competitors. Sure, they would rush to fill the gap, but as first mover you could probably be “the TV” to get.

            • NTMBK
            • 2 years ago

            How many PS4/XBO owners are looking to buy a new TV? The ones who really care about tech have probably already upgraded to 4K HDR to go with their PS4 Pro/ XB1X.

            • DPete27
            • 2 years ago

            Yes, they missed the boat, but the ship hasn’t sailed. There are still plenty of consolers out there who haven’t upgraded yet.

            I agree with Chrispy. Why isn’t AMD working with MS/Sony to push for FreeSync in TVs? It seems like such a simple sell that would pay dividends in the long run because then Nvidia would be facing a HUGE uphill battle to get into next-gen consoles AND FreeSync would explode in popularity. Perhaps this is the kick in the pants that AMD needs to push FreeSync into TV-land.

        • freebird
        • 2 years ago

        Nobody was going to put VRR in a TV without it being supported over HDMI, which it now is in HDMI 2.1:
        [url]https://images.idgesg.net/images/article/2017/11/hdmi-version-100743089-large.jpg[/url] Maybe late this year or next I'll think about getting a quality HDR 4K TV, if it has VRR also.

        • xeridea
        • 2 years ago

        Mining has sold out Nvidia cards also. Newegg has pretty much 0 stock of 1070(TI) and 1080(TI), unless you want to pay $150-$300 premium. 1060s go for about $50 premium.

      • stefem
      • 2 years ago

      The phrase “NVIDIA should support FreeSync” doesn’t even make sense; maybe you mean that they should support VESA Adaptive-Sync. FreeSync is a trademark of AMD that refers to the solution they implemented in their driver to make VRR actually usable.

    • Airmantharp
    • 2 years ago

    Dang, and I just bought an LG OLED…

    Though I think these might not come with OLED panels, and might be just a tad more expensive 😉

      • demolition
      • 2 years ago

      No, not OLED, unfortunately. They mention quantum dots, and that is an LCD tech. It might still not be bad though, and if the focus is gaming, it would probably be the best compromise, as OLED wear is still a thing. While not an issue for movie watching, it could be an issue with some games that have static visual elements in them.

      Using G-Sync for movies is also a good idea although many decent displays will already display the common 24/25/30/50/60 Hz refresh rates without interpolation, e.g. by playing back 24 Hz content at 96 Hz etc. And I am not talking about ‘intelligent frame creation’ and similar inventions as these horrible effects should always be switched off, but simple 3x or 4x frame duplications.

      Now, if they could do ULMB with movies, then that might be interesting. Normally, you cannot combine G-Sync and ULMB, but in this case the frame rate is constant so it might be a different story.

        • Firestarter
        • 2 years ago

        Movie framerate is static, so adaptive sync doesn’t do anything. ULMB might be interesting though, since it sort of requires a static framerate for best results, but for the benefits of ULMB to outshine the drawbacks, the framerate has to be decently high. I don’t think we’ll see 100+ FPS film content anytime soon, outside of maybe VODs of competitive gaming.

          • renz496
          • 2 years ago

          From my understanding you don’t need high refresh rates to use ULMB. It’s just that G-Sync users run ULMB instead of G-Sync at much higher frame rates, because G-Sync’s benefit is minimal at high frame rates.

            • stefem
            • 2 years ago

            You need high refresh rate to avoid flicker

        • EndlessWaves
        • 2 years ago

        The current QDEF is LCD tech for obvious reasons but I’m not sure quantum dots as a whole will be. QDCF seems like it could be used for OLEDs.

      • albundy
      • 2 years ago

      You can’t compare. OLED gaming is just that sweet.
