HDMI 2.1 spec has support for VRR, 10K resolution, and Dynamic HDR

HDMI 2.0 outputs with support for 3840×2160 at 60 Hz are still not ubiquitous, even on brand-new PCs. As an example, the UHD Graphics unit included in Intel's latest eighth-generation Core processors needs help from a converter chip to achieve that resolution and refresh rate. The HDMI Forum is nevertheless charging ahead and has released the HDMI 2.1 specification. The new spec includes support for resolutions up to 10K (10240×4320) at refresh rates up to 120 Hz. The mandatory Ultra High Speed HDMI cable can now carry up to 48 Gbps, and the spec allows for Display Strem Compression (DSC).
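
For a sense of scale, here is a rough back-of-the-envelope look at the uncompressed data rates involved (a sketch in plain arithmetic, not figures lifted from the spec; it ignores blanking intervals and link-encoding overhead, both of which push the real on-the-wire requirement higher):

```python
# Back-of-the-envelope data rates for uncompressed video (blanking and link
# encoding overhead ignored, so real on-the-wire requirements are higher).

def raw_gbps(width, height, fps, bits_per_pixel):
    """Active-pixel data rate in Gbps at full 4:4:4 chroma."""
    return width * height * fps * bits_per_pixel / 1e9

modes = {
    "4K60, 8-bit":    (3840, 2160, 60, 24),
    "4K60, 10-bit":   (3840, 2160, 60, 30),
    "10K120, 10-bit": (10240, 4320, 120, 30),
}

for name, args in modes.items():
    print(f"{name}: ~{raw_gbps(*args):.1f} Gbps of pixel data")

# 4K60 at 8 bits per channel is roughly 12 Gbps of pixel data, which swells to
# about the full 18 Gbps HDMI 2.0 budget once blanking and TMDS 8b/10b encoding
# are added -- hence the converter chips. 10K120 at 10 bits is nearly 160 Gbps,
# far beyond even 48 Gbps, which is where DSC comes in.
```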

Among other features, HDMI 2.1 also supports variable refresh rates, an automatic low-latency mode, and HDR metadata. HDR support is extended from using a single image descriptor for an entire piece of content to a new Dynamic HDR mode that can apply image descriptors optimized on a scene-by-scene or even frame-by-frame basis. The new standard further supports next-gen color spaces like BT.2020 for future UHD content.

The new Ultra High Speed HDMI cable uses the same Type A, C, and D connectors as its forebears, but offers increased 48 Gbps bandwidth compared to the 18 Gbps possible with existing HDMI 2.0 cables. The new Enhanced Audio Return Channel (eARC) is backwards-compatible with the existing ARC and has support for uncompressed and object-based audio formats like DTS:X and Dolby Atmos.

Much like FreeSync and G-Sync, the variable refresh rate tech baked into HDMI 2.1 should help decrease input latency and eliminate frame tearing. The new Quick Media Switching feature allows changes in refresh rates and resolution without the customary and annoying display blackout. The new feature list also includes Quick Frame Transport, a reduced-latency mode for games, mixed reality, and karaoke. An Auto Low Latency mode allows source units to establish the ideal latency settings for different types of media, too.

If you're looking for more information, check out the HDMI Forum's Release Presentation. The document touches on the spec's main points without getting too bogged down in implementation details.

Comments closed
    • kamikaziechameleon
    • 2 years ago

    Wait, is this changing anything in the current cable standard? Or will most HDMI cables be fine?

      • willmore
      • 2 years ago

      Yeah, this seems like a very important question that didn’t even get hinted at.

      • willmore
      • 2 years ago

      Okay, according to: [url<]http://www.theregister.co.uk/2017/11/30/hdmi_2_1_specification/[/url<] (who often get stuff horribly wrong) the spec requires new cables. The good news seems to be that they will at least be *backwards* compatible. So, you can still do 1080p on your fancy new cable. How messed up would it have been to not be backwards compatible?

    • Kretschmer
    • 2 years ago

    Coming to a TV near you in 2025!

    • hasseb64
    • 2 years ago

    What? They are ahead of market??
    Okey, so this is the last upgrade right? No need for higher data, ever!!

      • LostCat
      • 2 years ago

      It is hard to see a need beyond this, but we will likely want higher refresh rates at some point.

    • mcarson09
    • 2 years ago

    So how long will it take for GPUs to be able to power 10K at 200fps?

      • Ummagumma
      • 2 years ago

      Your question begs the question:

      Is this technology simply a spec (meaning it’s “just words on a page” backed by some fancy math and engineering documents somewhere), or was it really tested in multiple labs by multiple 3rd party test companies and proven to perform at these levels?

      Nothing in the press release hints at this announcement as having any actual physical working technology behind it that consumers can purchase TODAY.

      All of this has me wondering if HDMI isn’t taking a page out of the “high-end hype” …err… “high end hi-fi” business.

    • willmore
    • 2 years ago

    In that resolution@framerate picture, do the asterisks apply to all of the entry they’re on or just the last one? For example, does 8K@100 need DSC or just 8K@120?

    Why group 8K48/50/60 separately from 8K100/120 unless the * applies to both 100 and 120. Then again, why split 10K48/50/60 from 10K100/120 if they both need DSC….

    That’s a confusing chart.

      • Shobai
      • 2 years ago

      And why would 10k/60 need DSC when 10k/100 apparently doesn’t?

        • willmore
        • 2 years ago

        Exactly, the chart is strange.

      • mark625
      • 2 years ago

      Yeah, I bet the column headers have been clipped from that graphic. Maybe the left column is for 10-bit HDR, and the center is for 8-bit (non-HDR). In that case, the asterisk might only apply to the last frame rate in that column.

      Otherwise, I would guess that anything over 8K/60 requires DSC, and having two columns is just a spurious marketing artifact.

      Not a video engineer, just guessing.

        • willmore
        • 2 years ago

        In the absence of facts all guesses are equal, but I think yours are good ones.
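
For what it's worth, simple arithmetic backs the broad shape of those guesses. A rough sketch, assuming roughly 42.7 Gbps of usable payload after the new link's 16b/18b encoding overhead, 8-bit 4:4:4 color, and no blanking, so it is indicative only rather than a reading of the chart:

```python
# Rough check of which modes could plausibly run uncompressed on a 48 Gbps link.
# Assumes ~42.7 Gbps usable payload (48 Gbps less 16b/18b encoding overhead),
# 8-bit 4:4:4 color, and no blanking -- indicative only.

PAYLOAD_GBPS = 48 * 16 / 18  # ~42.7 Gbps

modes = {
    "8K60":   (7680, 4320, 60),
    "8K120":  (7680, 4320, 120),
    "10K60":  (10240, 4320, 60),
    "10K120": (10240, 4320, 120),
}

for name, (w, h, fps) in modes.items():
    gbps = w * h * fps * 24 / 1e9
    verdict = "needs DSC or chroma subsampling" if gbps > PAYLOAD_GBPS else "fits uncompressed"
    print(f"{name}: ~{gbps:.1f} Gbps -> {verdict}")
```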

    • willmore
    • 2 years ago

    What’s [quote<]Display Strem Compression (DSC)[/quote<]?

      • UberGerbil
      • 2 years ago

      Something DP is already (optionally) using [url<]https://en.wikipedia.org/wiki/DisplayPort#Display_Stream_Compression[/url<]

        • willmore
        • 2 years ago

        That wikipedia page lacks any mention of Strem.

      • Shobai
      • 2 years ago

      That sentence exemplifies DSC; the compression algorithm condensed the ‘a’ out, while maintaining legibility.

        • willmore
        • 2 years ago

          So, I’m the only one who sees the ‘artifacts’? 😉

          • derFunkenstein
          • 2 years ago

          I certainly didn’t, because I saw nothing wrong with it until you questioned UberGerbil’s response.

      • PrincipalSkinner
      • 2 years ago

      Obviously, Strem is a compressed Stream.

        • caconym
        • 2 years ago

        *applause*

    • odizzido
    • 2 years ago

    Nvidia CEO is like fuuuuuuuuuuu

    • MrDweezil
    • 2 years ago

    I spent several hours over the thanksgiving break running a new 35′ hdmi cable through my walls and floor since the old one wouldn’t pass HDR. Can’t wait to do that again.

      • cygnus1
      • 2 years ago

      I take it you have no set top boxes of any kind around the TV? All the gear in another room/closet with a Logitech Harmony or some other RF based remote?

        • MrDweezil
        • 2 years ago

        IR extender instead of RF, but yes.

          • cygnus1
          • 2 years ago

          Gotcha. I’m always curious about what other folks do for home theater setups. Are you RF extending all the original remotes for all the devices or do you use something like Harmony to get everything on 1 remote?

            • MrDweezil
            • 2 years ago

            The cheapo Harmony 650 (less than $50) does the job for me. It only needs to control the TV (looking at it, no need for IR extension there), receiver, and cable box. There’s an Xbox and a Chromecast in the cabinet too, but the Harmony just needs to switch receiver inputs for those since the Xbox can be controlled/powered on with its own wireless controller and the Chromecast is controlled by your phone over WiFi.

            • cygnus1
            • 2 years ago

            Ahh, forgot about the IR based Harmony’s. I have the basic one that goes with the Harmony hub, the one with no screen, which is RF. I guess either way gets the same task done, boxes in another room. I’m a fan of not needing to point the remote at anywhere specific though. It makes for a shocking improvement in usability.

      • Spunjji
      • 2 years ago

      I hate to be that guy, but can’t you just tie an end of the new cable to one end of the old cable and then pull it through from the other? I imagine there must be other complications involved here, mind.

        • cygnus1
        • 2 years ago

        Would totally depend on the route his run takes. If it goes through holes in studs or had to be secured in some way in the ceiling or wall, it can be nearly impossible to use the existing wire as a pull cord for new wires.

        • MrDweezil
        • 2 years ago

        The old cable was there when I bought the house and whoever installed it saw fit to put zip ties every 2 feet that tightly bundled it together with various speaker/ethernet cables that I wanted to keep in place. Pulling just one thing out of there wasn’t going to happen.

          • UberGerbil
          • 2 years ago

          “Nice and neat because it’ll be there forever,” was the thinking, I’m sure. I have a friend who was planning to do something similar while he had a wall open and I convinced him to run conduit instead. He’s already thanked me when he discovered he wanted to pull another cable (“I’m never going to put the subwoofer over there!”… “The wife wants me to put the subwoofer over there”).

      • willmore
      • 2 years ago

      It’s called conduit. You might want to look into it. 🙂

      • End User
      • 2 years ago

      I feel your pain. I had to upgrade my HDMI cable to get Dolby Vision and that was just for an Apple TV sitting next to my HDTV.

      • the
      • 2 years ago

      Only 35 ft? I just ran half a dozen shielded CAT6 through the walls of a former casino for HDBaseT runs. Finally replacing ancient RGBHV runs that look to be decades old.

      • tuxroller
      • 2 years ago

      Depending on the exact layout you might be able to just tie one end of the old cable to the new cable and pull from the other end.
      Works for ethernet.

      Edit: nevermind…. Thanks spunjji!

    • Sahrin
    • 2 years ago

    Sad to see HDMI passing DisplayPort. 🙁 Patent Encumbered < Royalty Free

      • jihadjoe
      • 2 years ago

      And yet [url=https://www.amazon.com/AmazonBasics-DisplayPort-Cable-Feet/dp/B01J8S6X2I<]Displayport cables[/url<] commonly cost 50-100% more than [url=https://www.amazon.com/AmazonBasics-High-Speed-HDMI-Cable-1-Pack/dp/B014I8SSD0<]equivalent HDMI[/url<]. Economies of scale > royalty free.

      • UberGerbil
      • 2 years ago

      The next version of DP was supposed to be out a while ago, at least as a spec; IIRC it was expected to at least offer 8K resolution at 120Hz (or lower res at higher rates or higher res at lower rates). If they switch from 8b/10b encoding to 128b/130b the same way PCIe did going from 2.0 to 3.0, they could pick up ~18% added bandwidth “for free” on existing cables. That won’t catch HDMI 2.1 on its own, but it would help.
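
As a quick aside, the arithmetic behind that encoding estimate is straightforward; the link rate below uses DisplayPort 1.4's HBR3 figures purely for illustration:

```python
# Payload efficiency of the two line encodings mentioned above.
# 8b/10b carries 8 payload bits per 10 line bits; 128b/130b carries 128 per 130.

eff_8b10b    = 8 / 10      # 80% of the raw line rate is payload
eff_128b130b = 128 / 130   # ~98.5%

relative_gain = eff_128b130b / eff_8b10b - 1        # ~23% more payload at the same line rate
point_gain    = (eff_128b130b - eff_8b10b) * 100    # ~18.5 percentage points of efficiency

# Illustration with DisplayPort 1.4's HBR3 link: 4 lanes x 8.1 Gbps = 32.4 Gbps raw.
raw = 32.4
print(f"8b/10b payload:    {raw * eff_8b10b:.1f} Gbps")     # ~25.9 Gbps
print(f"128b/130b payload: {raw * eff_128b130b:.1f} Gbps")  # ~31.9 Gbps
print(f"Gain: ~{relative_gain * 100:.0f}% more payload (~{point_gain:.0f} points of efficiency)")
```

Whichever way you read the figure, the switch would buy DisplayPort a meaningful chunk of extra usable bandwidth on existing cables without, on its own, catching HDMI 2.1's 48 Gbps.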

    • End User
    • 2 years ago

    “10K resolution”

    Finally!

      • Scrotos
      • 2 years ago

      It’s over 9000!!!

        • DoomGuy64
        • 2 years ago

        Literally!

        • willmore
        • 2 years ago

        It’s really 8K…… Hey, you know, that actually works here, too. 😉

        • End User
        • 2 years ago

        That should be the sales slogan for 10K.

      • Krogoth
      • 2 years ago

      It is practically useless for computer monitors though. It is only good for massive screens (billboards/theaters) which is why it is part of the HDMI spec.

        • the
        • 2 years ago

        For billboards, it is likely a 10K source will be split across multiple subpanels that only reach such a resolution in aggregate. The front-end source won’t know the difference, but the backend is rarely one physical setup.

        For projectors, ultra-high resolution is a major selling point. The actual theater market can bypass some of the cabling limitations of ultra-high resolution, since the playback engine can be embedded into the projectors themselves (hello DRM). Internally they also have the option of breaking a 10K file down into four 5K streams so that DSC wouldn’t come into play and the external user wouldn’t even know. Externally, many projector installations are going to run into the cable-length problem, as 10K looks to only be provided by sources physically close to them. SDI has a 48 Gbit version in the works which can provide similar resolutions at great distances but lacks copy protection. HDBaseT has the necessary DRM to encapsulate HDMI, but the version of the spec that’ll support beyond 4K60 will require multiple cables or fiber. The other factor for projectors is that while high resolution comes at a premium, that industry is busy transitioning toward lasers, which command another premium on top of that.
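
Rough numbers on that four-way split, assuming 10K here means 10240×4320 carved into 5120×2160 quadrants at 120 Hz with 10-bit 4:4:4 color, and ignoring blanking:

```python
# Sanity check on splitting a 10K signal into four 5K quadrant streams.
# Assumes 10240x4320 -> four 5120x2160 quadrants, 120 Hz, 10-bit 4:4:4,
# blanking ignored.

def raw_gbps(w, h, fps, bpp=30):
    return w * h * fps * bpp / 1e9

full     = raw_gbps(10240, 4320, 120)  # ~159 Gbps uncompressed
quadrant = raw_gbps(5120, 2160, 120)   # ~40 Gbps uncompressed

print(f"Full 10K120 stream:  ~{full:.0f} Gbps")
print(f"One 5K120 quadrant:  ~{quadrant:.0f} Gbps")
# Each quadrant lands under a 48 Gbps link's budget, so four links could carry
# the image without DSC, as described above.
```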

      • designerfx
      • 2 years ago

      no worries, I’m sure we’ll have graphics cards that can handle this in ~5-10 years.

        • Krogoth
        • 2 years ago

        4K is already near the limit of human visual acuity on small screens aka computer monitors and smaller HDTV units.

        These resolutions only make sense on massive screens.

    • Chrispy_
    • 2 years ago

    All I care about is Freesync televisions.

    I want to play on the sofa with my 60″ TV but I don’t move from my desktop at the moment because I’d be giving up Freesync.

      • Duct Tape Dude
      • 2 years ago

      Yeah but then you need to buy a new TV. Instead, what if you bought a new GPU powerful enough that Freesync is moot?

        • Spunjji
        • 2 years ago

        That latter benefit goes away more swiftly than the former. Also in my experience VRR is never moot no matter how powerful the setup (I had GTX 980s in SLI for a while)… I hate V-Sync though.

          • Chrispy_
          • 2 years ago

          Can’t argue with that. DTD, you sound like someone who hasn’t really used VRR yet.

          Even though 60Hz vsync with zero missed frames is pleasantly smooth, it’s not even close to the smoothness or low-latency that Freesync/G-Sync offer.

          VRR is something you don’t really notice when you first start to use it, but then you go back to 60Hz with regular vsync and wonder how you even put up with all the hitches and inconsistent animation smoothness beforehand.

            • LocalCitizen
            • 2 years ago

            i don’t have vrr, but i’m wondering: how does vrr compare to 120hz vsync?

            • Duct Tape Dude
            • 2 years ago

            I’ve done a 290X@48-90Hz Freesync w/LFC, and can tell you a 1080Ti@100Hz Vsync is more pleasant. Yes, there are minute latency nuances and I can feel a difference with and without Vsync, but Just Cause 3 skill courses tell me I perform better with more, constant frames anyway. Like displacement, there’s no substitute for more frames.

            I appreciate your assumptions though.

            • Chrispy_
            • 2 years ago

            There is no substitute for more frames, but you’re comparing apples to oranges. 100fps is always going to be better than 48fps.

            Don’t change the subject away from the topic at hand, TVs: Currently it’s 60Hz fixed. 100Hz or 144Hz discussions about more frames are irrelevant. It’s VRR or no VRR and I’ll take VRR if it’s an option, thanks; You’d be mad not to.

            • Duct Tape Dude
            • 2 years ago

            Fair point, I’d take 48+fps Freesync over a fixed 60Hz (or effective 30fps when Vsync’d). But for TVs in particular it gets tricky, and I think there is some relevance here because it’s not always 60Hz fixed: many 60Hz 4K TVs can also do 120Hz 1080p. At 120Hz, 24fps content doesn’t suffer from 3:2 pulldown, the benefits of VRR are lessened, and 1080p gaming on a 60″ TV becomes more appealing.

            The downside is you have to set the chroma and all that properly otherwise the scalers kick in 🙁

            My point is: Yes, VRR is better. But if I had the budget for a replacement 60″ TV and were optimizing for gaming, maybe that TV budget is better spent on a beefier GPU.

            If I were optimizing for [i<]anything else[/i<], especially NTSC's crazy 23.976 or 29.97fps content, a replacement VRR TV would totally be more worthwhile. Plus, if I bought a replacement VRR TV, then I'd have a spare TV that I could put anywhere... like in the bedroom ceiling. Hmm... [i<]edit: forgot a word[/i<]

            • Chrispy_
            • 2 years ago

            If they made native 120Hz TV’s I think I’d be happy with that regardless of VRR because of the quantisation you mentioned:

            24, 30, 48, 60Hz content would all look perfect,
            23.976 would only skip one frame in 42 seconds,
            The vsync latency at 120Hz is only 8.3ms
            Even with a missed frame at 120Hz vsync, it’s still 60Hz which is fine.
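
Those cadence figures hold up to simple arithmetic. A sketch, assuming an exactly 120.000 Hz refresh clock:

```python
# Checking the 120 Hz cadence math, assuming an exact 120.000 Hz refresh clock.
from fractions import Fraction

refresh = Fraction(120)

# How many refreshes each content frame occupies:
for fps in (24, 30, 48, 60):
    print(f"{fps} fps -> {refresh / fps} refreshes per frame")
# 24, 30, and 60 divide 120 evenly; 48 works out to 5/2, so it actually
# alternates 2 and 3 repeats rather than landing perfectly.

# NTSC-style 23.976 fps (24000/1001) drifts against an even 24 fps cadence:
ntsc = Fraction(24000, 1001)
drift_per_second = 24 - ntsc
print(f"One dropped/repeated frame every ~{float(1 / drift_per_second):.1f} s")  # ~41.7 s

# Worst-case wait for the next refresh at 120 Hz:
print(f"Vsync granularity: {1000 / 120:.1f} ms")  # ~8.3 ms
```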

      • DoomGuy64
      • 2 years ago

      The real question is, do the new consoles support freesync over hdmi or only displayport?

        • cygnus1
        • 2 years ago

        Well, they don’t support HDMI 2.1, so probably not so much

          • LostCat
          • 2 years ago

          It’s been said they are likely HDMI 2.1 compliant.

          – [url<]https://gamingbolt.com/xbox-one-x-hdmi-2-1-spec-yet-to-be-finalized-new-update-will-fix-blu-ray-issues[/url<]

            • cygnus1
            • 2 years ago

            They might be able to enable a subset of features, but unless they had early access (and by early, I mean a couple years ago when the current chips were being planned) to a mostly complete spec, it probably needs new silicon.

            • DoomGuy64
            • 2 years ago

            I didn’t mean full HDMI 2.1 compliance, just whether or not VRR will work over HDMI on the consoles.

        • RAGEPRO
        • 2 years ago

        The Xbox One X supports FreeSync over HDMI for sure.

        The PS4’s (and Pro’s) HDMI connection is actually a DisplayPort connection with an internal adapter. I bet it could support FreeSync/HDMI 2.1 VRR with an update.

          • the
          • 2 years ago

          Active adapters, this one being integrated into the console per your comments, can’t be upgraded like that after the fact.

          Realistically we’ll likely see HDMI 2.1 support in the next revision of the PS4 Pro and Xbox One S that miniaturizes the internals further and gives the SoC a die shrink. This is one of those features that is pretty straightforward to add and for software developers to take advantage of. Any support for VRR on consoles will be opt-in for developers or manually forced by end users. Odd bugs tend to crop up in consoles when something like this changes, due to developers having close access to the metal.

    • DancinJack
    • 2 years ago

    [quote=”HDMI Forum”<]Another aspect of the enhanced refresh rate capabilities is Quick Frame Transport (QFT) • Each video frame travels faster from the source even though the source does not increase its frame rate and results in decreasing latency • This reduces lag for gaming, real-time interactive virtual reality, and enables more responsive karaoke[/quote<] Does anyone know what they're doing here? There is virtually no info I can find on this QFT shiz.

      • brucethemoose
      • 2 years ago

      Hmmm, do normal HDMI frames have to wait for a clock signal?

        • RAGEPRO
        • 2 years ago

        I believe so. As far as I understand HDMI is based on the same TMDS signaling that DVI used. TMDS does require a clock generator, so following on from that…
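
The released material only describes the effect of QFT, not its mechanism, but the basic idea can be illustrated with a toy timing calculation; all of the numbers below are illustrative assumptions rather than spec values:

```python
# Toy illustration of the Quick Frame Transport idea: the source still produces
# 60 frames per second, but each frame's bits cross the link in a shorter burst,
# so the display holds a complete frame sooner. Numbers are illustrative only.

frame_rate = 60
frame_period_ms = 1000 / frame_rate     # ~16.7 ms between frame starts

frame_bits = 3840 * 2160 * 24           # one 4K 8-bit frame, blanking ignored

def delivery_ms(payload_gbps):
    """Time to push one frame across a link with the given usable payload rate."""
    return frame_bits / (payload_gbps * 1e9) * 1000

paced_gbps = frame_bits / (frame_period_ms / 1000) / 1e9  # spread over the whole period (~12 Gbps)
burst_gbps = 40.0                                         # a faster link used as a burst (assumed)

print(f"Frame period:           {frame_period_ms:.1f} ms")
print(f"Paced at {paced_gbps:.1f} Gbps:  frame complete after ~{delivery_ms(paced_gbps):.1f} ms")
print(f"Burst at {burst_gbps:.1f} Gbps:  frame complete after ~{delivery_ms(burst_gbps):.1f} ms")
```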

    • chuckula
    • 2 years ago

    [quote<]The new feature list also includes Quick Frame Transport, a reduced-latency mode for games, mixed reality, and karaoke.[/quote<] This is worthy of a Nobel Prize! I can finally sing Free Bird with inside-the-wing-flap latency!

      • Pville_Piper
      • 2 years ago

      Is this where we all chime in and tell you not to quit your day job?

        • derFunkenstein
        • 2 years ago

        No.

        I’m gonna make some popcorn. BRB.

      • Neutronbeam
      • 2 years ago

      You scare me.

    • USAFTW
    • 2 years ago

    FIRST!
    Here’s hoping Nvidia gets off their high horse and supports the full spec for consumers’ sake.

      • DoomGuy64
      • 2 years ago

      If VRR is a requirement of HDMI 2.1, Nvidia will have to at least physically support it to be compliant. If Nvidia still doesn’t support vrr in software, customers can complain and Nvidia won’t have an excuse. That, or they won’t support HDMI 2.1 at all, and consumers won’t like that either. Nvidia can’t keep blocking this feature without having major blowback now.

      • brucethemoose
      • 2 years ago

      I bet they’ll slap an “Optimized for GSync” sticker on certain HDMI 2.1 VRR monitors whose OEMs pay the fee.

      I also bet they’ll support HDMI 2.1 VRR, but there’s no way they’re getting rid of Nvidia branded monitors.

        • USAFTW
        • 2 years ago

        Seems like a possibility. I would actually appreciate some quality control on Nvidia’s part as long as it doesn’t translate into a massive price hike.
        Segmenting G-Sync into two options, the pay-up-for-the-module premium version and module-less G-Sync essential for basic VRR would be fine, IMO.

          • DPete27
          • 2 years ago

          Except the module itself doesn’t really give you anything that FreeSync doesn’t also offer….so…

          • DoomGuy64
          • 2 years ago

          The module is complete snake oil, and technically not any better than freesync. The real reason why gsync is considered “better” is that Nvidia only allows high-end monitors to use it, unlike freesync, which is free for everyone to implement. It is essentially Apple-like hardware quality control, not a superior technology. Gsync is actually worse for being an expensive module; it was literally a hack invented to kludge VRR support into outdated display standards, and it only kept going because it made a good walled garden.

          Samsung’s Freesync2 [url=https://www.newegg.com/Product/Product.aspx?Item=N82E16824022583<]C27HG70[/url<] does everything Gsync is known for, plus HDR. The firmware can also be updated over USB, making it possible for modders to potentially add features. Now if only everyone else did the same.

          • brucethemoose
          • 2 years ago

          It doesn’t necessarily need the module. GSync would become a brand and nothing more, signifying some kind of quality control by Nvidia.

          Those stickers are like crack to OEMs. They just love em, gives them another bullet point in marketing material that consumers could recognize.

            • meerkt
            • 2 years ago

            As long as the stickers don’t tear and don’t leave residue, I don’t mind them.
