LG’s 2019 OLED TVs are getting G-Sync certification

For a long time now, the gap between TVs and monitors has grown smaller and smaller. Gaming on a CRT monitor after using a TV was like putting glasses on for the first time. These days, the differences are in smaller details. LG and Nvidia closed the gap just a little more this week, adding G-Sync Compatible certification to LG's 2019 OLED TVs.

LG announced this week that its 2019 C9 and E9 OLED televisions are now certified as G-Sync Compatible after passing Nvidia's validation testing. With this certification, you'll be able to plug a C9 or E9 television (in sizes from 55 to 77 inches) into a PC with a 20-series Nvidia graphics card and enjoy variable refresh rates at up to 120Hz.

LG G-Sync Compatible OLED TV

To be completely clear, these televisions do not have G-Sync hardware in them. Instead, they fall under the G-Sync Compatible umbrella, sharing it with other variable-refresh displays designed for AMD’s FreeSync tech.
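For readers who haven't used a variable-refresh display, the core idea is simple: instead of scanning out on a fixed tick, the panel refreshes when a frame is actually ready. The following sketch is a hypothetical illustration in Python, with made-up frame times and a 120Hz ceiling standing in for the panel's maximum refresh rate, not anything measured from these TVs; it compares when frames would reach the screen on a fixed 60Hz display versus a VRR one.

# Hypothetical sketch: compare when frames appear on screen with a fixed
# 60 Hz refresh versus a variable-refresh (VRR) panel capped at 120 Hz.
# The frame render times below are invented example values.

FIXED_HZ = 60
FIXED_INTERVAL_MS = 1000 / FIXED_HZ   # 16.7 ms between fixed refreshes
VRR_MIN_INTERVAL_MS = 1000 / 120      # panel's fastest allowed refresh

frame_render_ms = [10.0, 22.0, 14.0, 30.0, 12.0]  # hypothetical GPU frame times

def fixed_refresh_times(render_ms):
    """Each finished frame waits for the next fixed refresh tick."""
    t, shown = 0.0, []
    for r in render_ms:
        t += r                                  # frame finishes at time t
        next_tick = -(-t // FIXED_INTERVAL_MS)  # ceiling division
        shown.append(next_tick * FIXED_INTERVAL_MS)
    return shown

def vrr_refresh_times(render_ms):
    """A VRR panel starts scanout as soon as a frame is ready,
    limited only by its maximum refresh rate."""
    t, last, shown = 0.0, float("-inf"), []
    for r in render_ms:
        t += r
        scanout = max(t, last + VRR_MIN_INTERVAL_MS)
        shown.append(scanout)
        last = scanout
    return shown

print("fixed 60 Hz:", [round(x, 1) for x in fixed_refresh_times(frame_render_ms)])
print("VRR:        ", [round(x, 1) for x in vrr_refresh_times(frame_render_ms)])

With these example numbers, every frame reaches the screen no later (and usually sooner) on the VRR panel, and the gaps between frames track the GPU's actual output instead of snapping to 16.7-ms multiples, which is why judder and tearing go away.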

These models already offer variable-refresh-rate gaming thanks to their HDMI 2.1 ports, but right now the Xbox One X is the only piece of gaming hardware that supports the variable refresh tech built into the HDMI 2.1 spec. Over on the PC side, graphics cards still use earlier HDMI specs, and variable refresh is handled through Nvidia's G-Sync and AMD's FreeSync, which builds on VESA's Adaptive-Sync standard.

This also doesn’t mean that G-Sync suddenly works through HDMI, either. Nvidia will release new drivers that enable RTX 20-series and GTX 16-series cards to connect PCs to G-Sync Compatible TVs. It’s still DisplayPort for the rest of us over here in PC land. Right now, LG’s OLED TVs are the only G-Sync Compatible TVs.

They’re TVs, too

Along with VRR, the televisions feature HDR10 and Dolby Vision and support Auto Low Latency Mode (ALLM). The latter is part of the HDMI 2.1 spec; it lets a console or PC tell the TV to switch off its extra image processing so games get the lowest latency possible.

While the gap between TVs and monitors is indeed closing, each category has its own advantages. OLED tech is a rare sight in the PC world, and its self-emissive pixels deliver outstanding contrast. Televisions climb past the 70-inch mark, while monitors rarely go past the 35-inch line.

Monitors, meanwhile, can provide lightning-fast response times and high pixel density. Big gaming displays are growing in popularity, though, with panels like Nvidia's Big Format Gaming Displays (BFGD) and Alienware's 55-inch OLED gaming display already starting to fill out the market.

These televisions are already available, and it stands to reason that we’ll see LG’s 2020 OLED televisions ship with G-Sync Compatible certification out of the box, though that’s not a guarantee. The 2019 models, meanwhile, will get their G-Sync compatibility via a firmware upgrade in the coming weeks.

Comments
anotherengineer

Pretty big screens, I guess good for gaming from the couch. Too bad there isn't something smaller in size and price that could fit on a desk.

derFunkenstein

Hallelujah. The ALLM and eARC train is starting to pull away from the station. It’s about time it got moving. Maybe in 2021 or 2022 when I’m in the market for a new TV it’ll be ubiquitous and affordable.

RdVi

I have a C9 65″ TV, but also a 5700XT… Any chance of LG partnering with AMD to do the same? Hell has definitely frozen over when a TV/monitor with adaptive sync supports Nvidia “gsync” and not AMD freesync (except that it does, for Xbox 1X…)

I was planning on upgrading to a higher-end GPU once HDMI 2.1 makes its way to GPUs and giving my 5700XT to my wife. I didn't think something like this would happen, TBH.

Spunjji

I’m pretty sure all LG 2019 OLED models already support FreeSync – that’s the mechanism by which “G-Sync” support is being implemented.

Bagerklestyne

Ok, so can someone tell me if ANYONE makes a high refresh 4k monitor/tv in the 33-40 inch range (like the Asus XG438Q but smaller)

James Bond

According to the Rtings “best gaming TV” roundup, the Samsung Q70 series, which supports FreeSync, is the “best” variable refresh rate TV, but its smallest variant is 49″. There is a Q60 series below this Q70 series that might have a smaller panel size, but I am not sure if it offers VRR.
https://www.rtings.com/tv/reviews/best/by-usage/video-gaming

Someone

“new drivers that enable RTX 20-series and GTX 16-series cards”

For what reason is this limited to only the newest generation of cards?

Krogoth

It requires full HDMI 2.0a/2.1 ports. That limits it to Turing.

Someone

Geforce 10-series has HDMI 2.0b support, and there are no cards I know of with 2.1 ports.

https://en.wikipedia.org/wiki/GeForce_10_series

MrDweezil

You know the answer to this

mattshwink

I am going to be getting one of these soon (65″ C9) either this weekend or Black Friday….good news.

Sweatshopking

Make sure to drop it off at my place

doomguy64

This pretty much signals the death of BFGD, and any future Gsync products. It's cheaper with better hardware. None of those "Gsync is better" fallacy arguments hold a scintilla of credibility with these panels. OLED vs LCD? Not even close.

Krogoth

G-Sync 1.0 was running into bandwidth issues with its module when going beyond 4K at high framerates. It is one of the big reasons why Nvidia started to implement VESA's adaptive sync spec on their desktop SKUs.

doomguy64

Gsync 1.0 was an FPGA chip combined with mCable-style processing capabilities. The chip processed vsync off with a validation signal as "variable refresh" because the video cards (Kepler) literally did not have variable refresh output hardware. If you could trick a Gsync panel into thinking an AMD card was Nvidia, or mCable released a freesync product, you would have the functionality of Gsync either way. mCable would have a killer product if they did this, as it would be superior to most existing upsamplers, but that window of opportunity is closing and their existing chips can't handle 1080p/1440p+ upsampling…

psuedonymous

That’s not even close to true. G-Sync transports frames asynchronously regardless of ‘version’ (though ‘1.0’ is a completely fan-made designation). The major difference between any of the G-Sync FPGA controllers and the later DP Adaptive-Sync commodity panel controllers is that the G-Sync controllers handled all frame processing on-board, with no preprocessing done by the GPU. That means all frame duplication (input frame miss), pixel switching time compensation, etc., is done monitor-side and transparent to the input signal. Freesync’s implementation of DP Adaptive Sync moves this to the GPU side: frames are duplicated before transport (in case of…

Krogoth

“G-Sync 1.0” uses the module (because it predates VESA’s adaptive sync) while “G-Sync Compatible” implements VESA’s adaptive sync spec. It only works with GPUs and monitors that properly implemented DisplayPort 1.2 or later. Theoretically, this means Maxwell and newer, as seen in their mobile counterparts. It was locked behind drivers with a few “oops” here and there on desktop GPUs until Nvidia dropped the facade almost a year ago. G-Sync Ultimate is proprietary nonsense like Freesync 2; both are just adaptive sync with proprietary HDR color spaces (for bandwidth reasons). They only exist because full 10-bit color space and beyond at…

psuedonymous

People are easily confused as “G-Sync” covers both a transport protocol and a panel driver (of which there are at least three different implementations on different models of FPGA, if not more). “They only exist because full 10-bit color space and beyond at 4K under high framerates is still too taxing for Displayport 1.4.” Utter and complete nonsense. Both G-Sync Ultimate and Freesync 2 operate within the exact same bandwidth limitations of DP 1.4(a) (and DP 1.3). All are subject to the same framerate caps at the same resolutions and bit depths (without using DSC or chroma subsampling, both…

Krogoth

You can’t have full 10-bit coloring and beyond along with ultra-high resolutions (4K and greater) and high framerates within the DisplayPort 1.4 spec. There’s not enough bandwidth to go around. You have to make a trade-off somewhere. Freesync 2/G-Sync Ultimate both answered this with their own limited color spaces that try to pass off as “HDR”. One uses drivers/profiles (software) and the other uses the panel controller (firmware). The bandwidth found in the upcoming DisplayPort 2.0 and new HDMI 2.1 specs eliminates the need for workarounds like Freesync 2/G-Sync Ultimate. G-Sync is intentionally confusing. That’s why the informal “G-Sync 1.0” moniker even exists…

psuedonymous

“There’s not enough bandwidth to go around. You have to make a trade-off somewhere. Freesync 2/G-Sync Ultimate both answered this with their own limited color spaces that try to pass off as “HDR”. One uses drivers/profiles (software) and the other uses the panel controller (firmware).” Nope, both deal with the problem the same way: limiting framerate, or dropping to chroma subsampling. Colourspace also has nothing to do with bandwidth, only bit-depth does (which is independent of colourspace). You could run REC2020 at 6bpp if you wanted (though you’d get horrific banding) or REC709 at 12bpp (as is done for mastering). G-Sync and…

Krogoth

Color space does bloody matter for bandwidth. Why do you think “limited color” spaces even exist (MCGA, VGA, EGA, CGA and so on)?

psuedonymous

Those are bit-depths (or more specifically, a combination of bit-depths and palettes), not colour spaces. Colour spaces refer essentially to the mapping of encoded colour to perceived colour (usually to L*a*b* space for practicality). Rec709, Rec2020, DCI-P3, AdobeRGB, sRGB: all are colourspaces, and all can be expressed at varying bit-depths.

Krogoth

They are the same thing as far as digital graphics are concerned. The amount of bit depth determines what kind of color space you can play around with. Bandwidth and memory capacity concerns place constraints on this. It is simply more expensive to create frames with large bit depths/color spaces. Those aforementioned standards play around with the limitations of the current interfaces and the color acuity of the human eye(s) with the goal of making photo-realistic imagery (as far as color is concerned).

psuedonymous

“They are same thing as far as digital graphics are concerned. The amount of bit depth determines what kind of color space you can play around with.”

Incorrect, colourspace and bit-depth are independent. You can use sRGB at 12 bits per channel if you wished (and you may do so during mastering), or do something silly like use REC2020 at 6 bits per channel (holy gradient banding, Batman!).
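A quick arithmetic check of that last point, as a hypothetical Python sketch: the number of bits in a frame depends only on resolution and bit depth, and the colour space is just a label for how those bits are interpreted.

# Hypothetical check: frame size depends on resolution and bit depth only;
# the colour space never enters the calculation.

def frame_megabytes(width, height, bits_per_channel, channels=3):
    bits = width * height * channels * bits_per_channel
    return bits / 8 / 1e6

for colourspace in ("sRGB", "REC2020"):
    # The colour space is only a label here; the cost is identical.
    print(f"4K frame, 10 bpc, {colourspace}: {frame_megabytes(3840, 2160, 10):.1f} MB")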

doomguy64

Lol, spin doctor. Saying what I say in a different manner. The FPGA was necessary first gen not only because controllers didn’t exist, but because it didn’t exist on the video card either. Nvidia completely bypassed the entire chain by handling “all frame processing on-board”, which is exactly what I said in different words. They knew adaptive was coming and wanted to beat AMD to market, so Nvidia literally dropped out of VESA, stole the VRR tech, and implemented it completely self-contained in an FPGA chip to beat everyone to market. That’s what happened. Gsync literally had to do…

psuedonymous

“The FPGA was necessary first gen not only because controllers didn’t exist, but because it didn’t exist on the video card either” False. Kepler can quite happily dispatch frames outside of the V-sync interval, as can almost any GPU. If it could not, then G-Sync would not be possible, as there would be no way to get asynchronous frames to the monitor in the first place. A panel controller cannot, after all, travel back in time, so it cannot grab a frame 8ms after the prior one if it needs to wait 16ms for the next V-sync interval. “They knew…

Krogoth

Wrong, Kepler can’t handle adaptive syncing on its own because the micro-architecture predates it (it taped out long before Panel Self-Refresh was officially released). That’s why it needed the FPGA module to handle it. Nvidia was the first to see the potential of panel self-refresh for adaptive syncing and didn’t want to wait until VESA finalized its own official spec. That’s why they rushed it to the market as “G-Sync 1.0” and got the early adopter crowd. AMD did a knee-jerk response by trying to hastily implement VESA’s adaptive sync and brand it as “Freesync”. It officially came out…

psuedonymous

“Wrong, Kepler can’t handle adaptive syncing on its own because the micro-architecture predates it (it taped out long before Panel Self-Refresh was officially released). That’s why it needed the FPGA module to handle it.” This is not true. If you want to go grab a scope and peek at the DP link coming from a Kepler card (even linked to an OG G-Sync monitor with the original retrofit FPGA kit), you will plainly see that frames are delivered asynchronously. There’s no way to fake that; either you’re delivering frames locked to V-sync or you aren’t. What the G-Sync module does…

doomguy64

Boy, you’re real smart. eDP was never implemented in Kepler or Maxwell because it was laptop technology never implemented on desktop, and the simple explanation is what you’ve already admitted: “is all done monitor-side and transparent to the input signal”. That means Kepler was simply running VSYNC OFF with a validation signal. Kepler NEVER had eDP or any other such tech implemented, BECAUSE IT PREDATED THOSE STANDARDS. Vsync off = asynchronous frame delivery. That’s why Gsync 1.0 had a BUFFER. If you don’t admit that it had a BUFFER, then you are flat-out shilling false information on purpose. Gsync 1.0+…

Krogoth

Maxwell had it in its laptop flavors. The whole walled garden was blocked by drivers/firmware. It was working by “accident” on desktop SKUs in a few beta releases. Nvidia threw in the towel when modules started to encounter bandwidth limitations at higher resolutions and the marketshare of VESA adaptive sync capable monitors started to outstrip G-Sync monitors by a good margin.

doomguy64

I don’t know if adaptive was ever working by accident on Maxwell. eDP is a mobile spec that requires an eDP monitor, and desktop parts have always had different output configurations. I think there were rumors of this, but I don’t recall if it ever actually worked. Someone may have either hacked the driver, made a photoshop, or put one of those removable MXM cards in a PCI-E adapter. The only working hack I’ve heard of is using an AMD APU to output video from an Nvidia card through Windows 10, and that was right before Nvidia started supporting freesync,…

doomguy64

https://pcper.com/2015/01/mobile-g-sync-confirmed-and-tested-with-leaked-alpha-driver/3/

“Here’s the facts: NVIDIA will release G-Sync on mobile devices without the requirement of a G-Sync module, but the company claims that there will be experience differences between desktop and mobile iterations of the technology.”

This situation was using a mobile chip, which is basically what I’ve said before. It does not work on desktop cards. Nvidia never made Maxwell compatible on the desktop, but their mobile chips were. eDP is a built-in feature for laptops, so of course it would work on laptop hardware. That doesn’t mean it was ever included on desktop Maxwell. The laptop…

psuedonymous

“That means Kepler was simply running VSYNC OFF” That was my point, yes. That’s what VRR is: running without V-sync. That’s its entire purpose. “That’s why Gsync 1.0 had a BUFFER. If you don’t admit that it had a BUFFER, then you are flat-out shilling false information on purpose.” The buffer was a look-behind buffer, not a frame delay buffer (which is why testing of the gen 1 module showed no additional input lag over V-sync: https://www.blurbusters.com/gsync/preview2/). “There is no other valid explanation” The buffer is to allow comparison of an incoming pixel value to the previous pixel value,…

Krogoth

Wrong, Kepler doesn’t implement anything out of the eDP specs. The architecture predates it (late 2000s); eDP was on the drawing boards when Kepler was taping out. VESA adaptive sync was always a thing, but VESA is always ultra-conservative (like most engineering associations) about finalizing new specs. Nvidia saw the marketing potential of it and went ahead of the crowd. This is basic business acumen. One of the most famous examples of this in the computer field is when Apple and Microsoft, in their infancy, capitalized on the Xerox Alto’s potential. The spin-doctoring is getting tiresome. Nvidia didn’t invent the wheel. They simply…

chuckula

This is a meta-comment about the comments and the new comments upon which I am commenting.

bthylafh

This is a metameta comment, just because.

superjawes

If we’re doing meta-comment commentary, we might as well comment (and test) the voting system. Keep trying to vote on this one (up or down), and see if we can break the counters again.

bthylafh

Upvote me while you’re at it so we can check it in the other direction.

superjawes

Both directions are borked.

John LAbate

This is a Meta-llica comment: Master of Puppets rules!

Krogoth

I need Healing…..

sweatshopking

from what

bthylafh

He’s got the mehs.

Rectal Prolapse

He’s a meh addict?
