Nvidia BFGDs behave just like bigger G-Sync screens, and that’s great

Nvidia's Big Format Gaming Displays (henceforth BFGDs) proved to be among the biggest news for PC gamers at CES, both figuratively and literally. I got to see both HP and Asus' takes on the idea while bouncing among the various booths and press events at the show, and from my limited time with both companies' products, I think Nvidia and its partners are primed to succeed in their mission to push high-refresh-rate and variable-refresh-rate gaming out of the bedroom or basement and onto the biggest screen in the house.

HP's Omen BFGD

What's most disarming about BFGDs, and perhaps their greatest success, is that they look and feel exactly like the high-refresh-rate G-Sync monitors we know and love, just scaled up. Once I got my hands on the controls, running around in Destiny 2 felt just as responsive and fluid as it does on the gaming monitors we have in the TR labs. Folks who have battled smeary LCDs with high input lag for HTPC gaming in the living room will be smitten with BFGDs, and the benefits of high-refresh-rate panels and low input lag should be evident with consoles, too.

Asus' ROG BFGD

Because of the trade-show surroundings, neither the Asus nor the HP displays I got to examine were set up in an optimal environment. The mouse and keyboard available for both demos were just inches away from the screen itself, and I probably could have counted pixels on the BFGDs had I squinted. That's not a great way to behold a 65" display, and I'd really have to use one in a living room before passing full judgment on the BFGD idea. Both of the screens I looked at were pre-production models, but even so, folks hoping for OLED-like slimness from these screens should temper their expectations on that front, too. Both BFGDs were quite thick front to back, although both had thin bezels that shouldn't distract in a typical living-room environment.

Whether living-room gamers will bite on the BFGD concept ultimately depends on the prices Acer, Asus, and HP slap on their screens. Nobody was talking numbers at CES, but none of my fellow media expected these screens to land anywhere near the price of even a premium LCD or OLED TV of similar size when we threw out our guesses. We'll have to see how those forecasts play out when BFGDs land on the market later this year. For the pickiest living-room gamers, it may not be easy to go home without a BFGD at any price once they've laid eyes on one of these beastly screens—and that may be the entire point.

Comments closed
    • PrincipalSkinner
    • 2 years ago

    A friend of mine once said “All necessary light for a human comes from the monitor”.
    The guy from the first photo clearly adheres to that principle.

    • Alexko
    • 2 years ago

    I have a feeling that the F in BFGD originally stood for something quite different from “Format”, and that this is just the public version of the full name.

    And if that is not the case, well, it should be.

    • oldog
    • 2 years ago

    Has anyone compared gaming on these displays versus the new LG OLED displays?

    • Voldenuit
    • 2 years ago

    I know this is an artifact of the shutter speed used in the capture, but it was a bit funny seeing a ghost frame in the lead image.

    • Voldenuit
    • 2 years ago

    I just upgraded from a 27″ 16:9 G-Sync panel to a 34″ 21:9 G-Sync panel, and I absolutely love being able to fill my field of view while playing Destiny 2 at 120 fps. (Don’t worry, the old monitor is doing great as a secondary monitor.)

    • jts888
    • 2 years ago

    I suppose it was too much to hope for that Nvidia would gracefully put the G-Sync wire protocol out to pasture with the advent of both VESA and HDMI VRR becoming standardized.

    And it’s not like the display-side FPGA based controllers couldn’t be field upgraded for VESA Adaptive-sync either. At this point I hope HDMI 2.1 just steamrolls the platform and we can move on from proprietary peripherals.

      • meerkt
      • 2 years ago

      Maybe Nvidia are just using the last year or two they have to monetize G-Sync before the opportunity is gone.

      • EndlessWaves
      • 2 years ago

      Unfortunately not enough graphics card reviews are pointing out this disadvantage of nVidia cards – that you have to pay extra for a g-sync monitor or suffer the inferior smoothness of a non-VRR monitor.

    • Chrispy_
    • 2 years ago

    Pricing!

    G-Sync monitors can be upwards of double the cost of equivalent non-G-Sync screens of the same size and quality, and large format displays are already plenty expensive.

    Do you like spending $1000 on a large TV? Well you’re in luck, you can now get the exact same thing with reduced gaming lag for just $2399.99 and we’ll even throw in a six foot Displayport cable worth $3.99!

      • EndlessWaves
      • 2 years ago

      1000USD for a 65″ TV with 300+ dimming zones and a 1000cd/m² brightness?

      Even without the nVidia stuff this is a top-of-the-range TV aimed at competing with flagships and -1 models. If you wanted to be optimistic on pricing then expect it to compare with the likes of the Sony XE93/X930E, and it’s more like a 65″ version of the XE94/X940E.

        • Chrispy_
        • 2 years ago

        Sorry, I’m not clued up on US pricing. I just figured $1000 was upper-end of the market. We’d be paying around €1200 but that includes tax.

          • psuedonymous
          • 2 years ago

          $1000 for a 65″ UHD HDR (premium, not ‘HDR’) panel is at the lower end of the market.

            • Chrispy_
            • 2 years ago

            Ouch! Rather you than me.

            [url=https://www.google.co.uk/search?q=UE65MU7000&source=lnms&tbm=shop&sa=X&ved=0ahUKEwi4j9iG0vXYAhURDOwKHTqVAogQ_AUICigB&biw=1920&bih=910#spd=6670948882186503249]Under £1000 without tax[/url] (tax is 20%) for a Samsung 65" UHD Smart TV with HDR1000. I'm not really that knowledgeable about TVs. Is that considered high-end or not? It seems to have all the fancy features and it's from a big-name brand. The equivalent specs from off-brand sets seem to start at [url=https://www.google.co.uk/search?q=Hisense+H65N6800&sa=X&biw=1920&bih=910&tbm=shop&tbs=p_ord:p&ei=_yJrWof7KND7kwXXxL7wCQ&ved=0ahUKEwjH9qa21PXYAhXQ_aQKHVeiD54Quw0I0AEoAQ#spd=14048022192718050394]£750 before tax[/url].

    • odizzido
    • 2 years ago

    You mean bad? Gsync sucks hard.

      • Thresher
      • 2 years ago

      I just upgraded to a GSync panel and I disagree with this completely.

      However, the lack of inputs on GSync monitors is freaking ridiculous. I don’t know if there will be a GSync 2 any time soon, but two inputs on a $750 monitor is nuts.

    • EndlessWaves
    • 2 years ago

    ” and the benefits of high-refresh-rate panels and low input lag should be evident even with consoles, too.”

    So NVidia have made a statement that they won’t support variable refresh rate and backlight strobing on consoles? It seems like absolute suicide to me to cut out the feature that would make the greatest difference to your biggest market.

      • DPete27
      • 2 years ago

      You do know that both the PS4Pro and XB1X contain AMD GPUs, right? And that GSync is proprietary Nvidia technology, right?

        • EndlessWaves
        • 2 years ago

        No, G-sync is a brand.

        Desktop g-sync monitors do use custom nVidia monitor electronics, but G-sync laptops use something different. Reportedly a standard feature of eDP.

        There’s nothing stopping nVidia from supporting a third set of technologies under the G-sync banner that would allow support for variable refresh rate and backlight strobing on consoles, maybe even on games streamed from phones and tablets too.

    • Kretschmer
    • 2 years ago

    Any word on strobing? That’s the most exciting gaming monitor feature right now.

    Also, if these behemoth displays end up under $700, why are 27″ or 34″ GSync monitors so damn expensive?

      • Redocbew
      • 2 years ago

      I can’t imagine that they would be. These are almost certainly going to be in the “if you have to ask” territory at least at first. It will be interesting to see what the pricing is though since the whole point here is that Nvidia wants you to buy one of these instead the next time you’re shopping for a TV. In that way they might not follow the pricing we see on monitors at all.

    • DPete27
    • 2 years ago

    Gee, ya know what could benefit from tech like this…..CONSOLES…..oh wait, whose GPUs are inside current-gen consoles…..AMD. It’s not like it took some stroke of genius to come up with the idea of VRR and low input lag on a TV.

    Get with the program AMD. Nvidia already pioneered* the idea of VRR with GSync, all you had to do was take it 1/2 a logical step further.

    /rant

    *Not sure if they were the first to come up with the idea, but I know GSync was first to market.

      • djayjp
      • 2 years ago

      It’ll be coming to TVs with HDMI 2.1. Just gotta play the waiting game….. Likely will be good to go by next gen (1-2 years from this fall).

      • freebird
      • 2 years ago

      Oh, you mean like this? It’s called HDMI 2.1:

      [url]https://techreport.com/news/31728/project-scorpio-will-support-freesync-2-and-hdmi-2-1-vrr[/url]
      [url]https://images.idgesg.net/images/article/2017/12/featuresupport-100743741-orig.jpg[/url]

        • DPete27
        • 2 years ago

        I do recall, but that article was referring to the console side of the equation. One could assume that TVs w/ HDMI2.1 would come with FreeSync, but it’s disheartening to not see any news about such products in the pipeline.

        I do also worry about what the state of FreeSync on TVs will be when it first comes out. Will we see similar comparisons to monitors where GSync has tightly controlled premium specs (at premium prices of course), and although FreeSync will have the same capabilities, manufacturers will take the cheap route to have a me-too product on the market? (ie VRR 48Hz-60Hz and same input lag as current TVs)

      • Goty
      • 2 years ago

      I feel like it’s not AMD’s fault that what is essentially the only connection present on consumer televisions is one that does not support VRR in its current implementation.

        • RAGEPRO
        • 2 years ago

        Freesync over HDMI is already 100% functional on many displays, even those that don’t explicitly support it. My LG 24UD58-B only talks about Freesync over DP, but it works on HDMI.

          • Goty
          • 2 years ago

          So it works with no work needed by the display vendors? Somehow I doubt it.

      • jihadjoe
      • 2 years ago

      AFAIK current consoles are pretty good at avoiding frame drops and input lag, but they do it the opposite way from VRR: Dynamic resolution + scaling.

      Output is always scaled to 1080P (or 4k), and then witchcraft in the console determines what the maximum resolution is in order for a frame to make it out within a target time for the scene currently being rendered.
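      The frame-budget heuristic described above could be sketched roughly like this (a purely illustrative model; the thresholds, scale bounds, and function name are my own assumptions, not anything the console makers have published):

```python
# Illustrative sketch of dynamic resolution scaling: each frame, compare the
# previous frame's render time against the frame-time budget for a fixed
# 60 Hz output, then raise or lower the internal render resolution. The
# output is always upscaled back to the display's native resolution.

FRAME_BUDGET_MS = 1000.0 / 60.0   # ~16.7 ms per frame at a fixed 60 Hz
MIN_SCALE, MAX_SCALE = 0.5, 1.0   # assumed clamp on internal resolution

def next_resolution_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the internal resolution scale toward the frame budget."""
    if last_frame_ms > FRAME_BUDGET_MS * 0.95:
        scale *= 0.9      # running hot: render fewer pixels next frame
    elif last_frame_ms < FRAME_BUDGET_MS * 0.75:
        scale *= 1.05     # plenty of headroom: sharpen back up
    return max(MIN_SCALE, min(MAX_SCALE, scale))
```

      With a controller like this the display always receives a frame on time at its fixed refresh rate, which is the opposite trade-off from VRR, where the resolution stays fixed and the refresh interval stretches instead.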

        • Voldenuit
        • 2 years ago

        [quote]AFAIK current consoles are pretty good at avoiding frame drops and input lag, but they do it the opposite way from VRR: Dynamic resolution + scaling[/quote]

        Nah, they do it with 30 fps.

        • DPete27
        • 2 years ago

        Yes, but I wonder if adding VRR to loosen the severity of witchcraft that needs to take place could open up new improvements.

          • jihadjoe
          • 2 years ago

          No doubt some games would benefit from it. The mostly single-player FPSs like Skyrim, for example.

          IMO the main reason consoles went with Dynamic Res instead of Variable Refresh is because of stuff like fighting games, where the entire game mechanic is locked to the 60fps/60Hz refresh. When move startup/active/recovery is directly measured in number of frames, and sometimes getting inputs on an exact frame timing is crucial, Variable Refresh will likely do more harm than good.
