DisplayPort 1.3 supports 5K displays, 4K at 120Hz

Dell's upcoming 5K monitor reportedly requires dual DisplayPort inputs to drive its 5120×2880 panel at full resolution. Future displays should be able to support that resolution with a single cable, though. VESA has announced a new version of the DisplayPort standard with enough bandwidth to transmit uncompressed 5K video.

DisplayPort 1.3 boosts the standard's total bandwidth to 32.4 Gbps, a 50% increase over DP 1.2. As with its predecessor, that bandwidth is split evenly across four lanes. Transport overhead takes a slice off the top, leaving 25.92 Gbps available for video—enough for a single 5K display at 60Hz, dual 4K monitors at 60Hz, or one 4K output at 120Hz. Pretty impressive for a single copper cable.
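
Those figures are easy to sanity-check. Here's a minimal back-of-the-envelope sketch (assuming 24 bits per pixel and ignoring blanking intervals, which push real-world requirements a little higher):

```python
# Rough bandwidth math for DisplayPort 1.3's headline modes.
# Assumes 24 bits per pixel and ignores blanking, so real
# requirements run slightly higher than these figures.

LINK_RATE = 32.4e9              # bits/s total across four lanes
EFFECTIVE = LINK_RATE * 8 / 10  # 8b/10b encoding leaves 25.92 Gbps

def needed(width, height, refresh_hz, bpp=24):
    """Bits per second for uncompressed video at the given mode."""
    return width * height * refresh_hz * bpp

modes = {
    "5K @ 60Hz":      needed(5120, 2880, 60),
    "4K @ 120Hz":     needed(3840, 2160, 120),
    "dual 4K @ 60Hz": 2 * needed(3840, 2160, 60),
}

for name, bits in modes.items():
    verdict = "fits" if bits <= EFFECTIVE else "too much"
    print(f"{name}: {bits / 1e9:.2f} Gbps ({verdict})")
```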

In addition to moar bandwidth, DisplayPort 1.3 supports HDCP 2.2 and HDMI 2.0 with CEC. It also works with the 4:2:0 pixel format, which is meant to prime the standard for televisions and "future 8K x 4K displays." Using that format, DisplayPort 1.3 should be able to pump out 7680×4320 images at 60Hz. 8K output without chroma subsampling is possible, too, but only at a lower refresh rate or color depth.
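
The 4:2:0 format is what squeezes 8K under that ceiling: storing the two chroma channels at quarter resolution cuts the average cost from 24 to 12 bits per pixel. A rough check, under the same simplifying assumptions as above:

```python
# Why 4:2:0 makes 8K at 60Hz possible: chroma stored at quarter
# resolution drops the average cost from 24 to 12 bits per pixel.

EFFECTIVE = 25.92e9  # usable DP 1.3 bandwidth after 8b/10b overhead

def gbps(width, height, refresh_hz, bpp):
    return width * height * refresh_hz * bpp / 1e9

print("8K @ 60Hz, 4:4:4:", round(gbps(7680, 4320, 60, 24), 1), "Gbps")  # ~47.8: too much
print("8K @ 60Hz, 4:2:0:", round(gbps(7680, 4320, 60, 12), 1), "Gbps")  # ~23.9: fits
```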

DisplayPort 1.3's additional bandwidth is good for more than just high-res video. The standard includes tweaked protocols for sharing display and data signals on a single cable. Combined with the faster pipe, those adjustments should be a boon to DockPort, which combines DisplayPort and USB 3.0 on the same interface.

Like previous DisplayPort standards, version 1.3 is available free of licensing fees. There's no word on when the first compatible displays and graphics cards will be available, however. I'll take a 4K IPS monitor with adaptive refresh rates up to 120Hz and a GPU fast enough to keep up. Please.

Comments closed
    • kamikaziechameleon
    • 5 years ago

    Really wondering when we will get a monitor with this spec: 4K or more, 30″-40″ in size, with a 60Hz or better panel

    • Billstevens
    • 5 years ago

    Finally

    • mark84
    • 5 years ago

    “moar bandwidth”?

      • deruberhanyok
      • 5 years ago

      to be specific, displayport 1.3 has all the bandwidths.

      • Visigoth
      • 5 years ago

      [url<]http://en.wiktionary.org/wiki/moar[/url<] It's legit.

    • Krogoth
    • 5 years ago

    I give it two to three more years before monitors with DP 1.3 ports begin to show up. Maxwell and Hawaii’s successors will be the first GPUs to come with DP 1.3 ports.

    • Kougar
    • 5 years ago

    That is awesome! I was just wondering about the possibility of 4K @ 120Hz. Here’s crossing fingers they make some affordable 4K @ 120Hz IPS-level quality display in the next several years.

    • PHGamer
    • 5 years ago

    Fucking hell, 4K is good enough, and that’s only if you have a huge TV; else 1440p is good enough. Fuck Dell for bringing 5K. Jesus

    • CheetoPet
    • 5 years ago

    Seems like panel technology is advancing faster than the signalling technology that feeds it.

    • TwoEars
    • 5 years ago

    Are they going to include free eye laser surgery?

    I honestly think 1440p is the current hardware limit of my eyes.

      • derFunkenstein
      • 5 years ago

      Same here. I really want a 27″ 2560×1440 display for the space, but a 5K would be wasted on me.

      • Laykun
      • 5 years ago

      Oculus Rift.

      • sjl
      • 5 years ago

      Frankly – for my needs – 2560×1600 at 30″ is more than enough. I can’t see myself justifying a larger, or higher resolution display for my purposes; I can’t even take in the entirety of the 30″ display (although it’s incredibly useful for displaying two or three documents, side by side, for cross referencing.)

      More pixels will increase the sharpness of text, yes, but I really do think that what’s out there now is more than ample for the vast majority of people. Rather, I think the value of these things is, one, large displays (eg: control room type situations), and two, driving down the price of the “smaller”, lower resolution displays.

    • Tristan
    • 5 years ago

    A good standard for the next 2-3 years. But I had hoped for 8K * 24bit/pixel * 60Hz. Maybe with compression that will be possible, but with lower image quality. The 4:2:0 pixel format is good only for TV because of blurred edges.

    • September
    • 5 years ago

    First product: Apple iMac 27″ Retina: [url<]http://www.macrumors.com/2014/09/15/retina-imacs-displays-displayport-1-3/[/url<]

      • blastdoor
      • 5 years ago

      But does an iMac really need this since the display is integrated?

        • NeelyCam
        • 5 years ago

        I would expect the display part to be connected to the PC part using DP, even if the connection is internal

          • blastdoor
          • 5 years ago

          I guess my point is just that Apple doesn’t need to wait for something to be declared an official standard before using it for an internal connection. On second thought, I guess a lack of an official standard wouldn’t stop Apple from using something for an external connection, either. So… never mind 🙂

    • Prestige Worldwide
    • 5 years ago

    Monitor makers: Your move.

    • El_MUERkO
    • 5 years ago

    AU Optronics are releasing two new 32″ 4K panels in Q3/4 2014, hopefully they’ll be paired with DP 1.3 connections.

      • willmore
      • 5 years ago

      It will depend on what controllers are attached to them. The panel itself doesn’t determine any of that.

    • Meadows
    • 5 years ago

    4K at 120 Hz.
    Make it happen. After that, shut up and take my money.

    (If such a display were to support G-Sync, that might be even better.)

      • sweatshopking
      • 5 years ago

      you going to sell some organs?

        • Meadows
        • 5 years ago

        Yes.
        Want to volunteer?

          • Airmantharp
          • 5 years ago

          I volunteer to assist with capture and extraction. Also, I’ll take pictures- gotta feed my other hobby.

            • Meadows
            • 5 years ago

            You dirty little man.

    • September
    • 5 years ago

    Why not compress the video stream? Surely with GSync type of processing on the monitor side it would have no problems uncompressing…

      • Parallax
      • 5 years ago

      Lossless compression gives inconsistent sizes for different frames, and doesn’t work well with higher bit-depths. What should happen if a frame can’t be compressed enough to fit within the bandwidth?

      Lossy compression introduces artifacts which many people, myself included, don’t want to deal with because it defeats the purpose of higher-resolution displays.

      Compression of either type could be done, but would also require decoding on the display side, further increasing costs and the complexity of displays.
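
Parallax's inconsistency point is easy to demonstrate. Here's a toy sketch using zlib as a stand-in codec (no display link actually uses zlib; any lossless scheme shows the same swing) on two synthetic 1080p frames:

```python
# Lossless compression ratios swing wildly with content, so a fixed-rate
# link can't count on any particular savings per frame. zlib stands in
# for whatever codec a display link might hypothetically use.
import os
import zlib

FRAME_BYTES = 1920 * 1080 * 3  # one 24-bit 1080p frame

frames = {
    "flat (solid black)": bytes(FRAME_BYTES),       # compresses brilliantly
    "noise (TV static)":  os.urandom(FRAME_BYTES),  # essentially incompressible
}

for name, frame in frames.items():
    ratio = len(zlib.compress(frame)) / len(frame)
    print(f"{name}: {ratio:.4f} of original size")
```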

        • September
        • 5 years ago

        So there is a technical challenge to overcome? Can’t be worse than the stuttering and tearing we have been subjected to for years! Maybe nVidia can solve this one too. 🙂

          • brucethemoose
          • 5 years ago

          With lossless compression, the cable/encoding has to be capable of handling the worst-case scenario: video that barely compresses at all.

          It’s not the technical challenge that’s the problem… It’s the minimal gains.

          On top of that, compressing/decompressing would increase power consumption, complexity, and add a bit of latency.

          • Klimax
          • 5 years ago

          Worst case is zero gain -> no saving at all and thus no advantage to complexity. (And still requiring bandwidth)

        • ferdinandh
        • 5 years ago

        Apparently AMD’s Tonga already uses lossless compression internally, which saves about 30%. Even if you were to use lossy compression, you would only be able to see it in a worst-case scenario: the image changes very fast, like TV static, or the resolution is so high (8K+) that you could only spot the compression if your TV had a ppi of 50.
        So I think using lossy compression for high-ppi screens is a great idea and totally doable. It’s an even greater idea for phone screens because of the power you can save with ultra-high-ppi screens.

      • eofpi
      • 5 years ago

      Compression and decompression adds latency.

        • MathMan
        • 5 years ago

        Yes, if you mean microseconds. IOW: no big deal.

          • Airmantharp
          • 5 years ago

          Microseconds? Maybe to do the actual compression, but most compression and decompression algorithms worth actually employing would likely require a buffered frame on BOTH ends. At 60Hz, that’s 33ms. And that means no.
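
For reference, the 33ms figure assumes one full frame buffered at each end of the link:

```python
# One full frame buffered at each end of the link adds two frame-times.
REFRESH_HZ = 60
frame_time_ms = 1000 / REFRESH_HZ  # ~16.7 ms per frame at 60Hz
print(2 * frame_time_ms)           # ~33 ms with a buffer on both ends
```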

            • Waco
            • 5 years ago

            This. It’d be fine for casual use…but certainly not gaming or anyone sensitive to latency.

      • LSDX
      • 5 years ago

      While I wouldn’t want my CPU to waste time on compressing the video signal while gaming, this should actually be possible for mpeg2/h264/hevc streams.

      Most TV sets already have mpeg2/h264 decoders, so why not? If it’s possible for audio (Dolby THD, DTS HD), it should be for video too. Including encryption/copy protection.

      Even for UHD+ streams (HEVC/DTS-HD), CAT5 could be good enough.

      But there’s surely a dumb reason the industry doesn’t want this, be it copy protection (not really a problem, the stream can be transported encrypted), licensing issues (decoders in TV sets only licensed to decompress ATSC/DVB streams), or some other crazy stuff…

    • DPete27
    • 5 years ago

    20% overhead? Wow, that seems like a lot, not very efficient.

      • slaimus
      • 5 years ago

      It’s the same for many modern protocols: [url<]http://en.wikipedia.org/wiki/8b/10b_encoding[/url<]

        • NeelyCam
        • 5 years ago

        Except PCIe Gen3 is using 128b/130b for a much lower coding overhead.

          • DPete27
          • 5 years ago

          My question is then: What is the 20% overhead impacting (or) where is it occurring? Is this a tax on the GPU?
          If so, how much aggregate performance hit is it causing?
          What are the challenges of improving overhead efficiency by going to something like the 1.5% for 128b/130b that PCIe 3.0 uses?
          I have ZERO knowledge about this technical stuff.

            • LASR
            • 5 years ago

            Lossless? Good idea. But then you aren’t going to save too much bandwidth.

            Lossy? Terrible idea.

            • MathMan
            • 5 years ago

            Don’t look at it as a waste: the 8b/10b encoding is essential to allow transferring data without needing a separate clock line.

            It has a few main functions:
            – guarantee that the data line is always toggling irrespective of the content of the data, which is essential for the clock recovery logic at the receiver side to extract the clock.
            – make sure that the average voltage on the line is zero. This is necessary to avoid charge build-up on the line, which would result in clipping.
            – error detection/correction (though the latter is less common than the former.)

            The 128b/130b coding of PCIe is quite a bit more complex, but it’s there for similar reasons, except that PCIe has a separate clock.
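
For a concrete sense of the efficiency gap between the two line codes (per-lane rates are the published figures for each standard):

```python
# Line-code overhead: DP 1.3's 8b/10b vs. PCIe 3.0's 128b/130b.
# Per-lane rates are the published figures for each standard.

def usable(raw_rate, payload_bits, total_bits):
    return raw_rate * payload_bits / total_bits

dp13_lane = 8.1   # Gbps per DisplayPort 1.3 lane (HBR3)
pcie3_lane = 8.0  # GT/s per PCIe 3.0 lane

print(f"DP 1.3 lane:   {usable(dp13_lane, 8, 10):.2f} Gbps usable (20% overhead)")
print(f"PCIe 3.0 lane: {usable(pcie3_lane, 128, 130):.2f} Gbps usable (~1.5% overhead)")
```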

            • NeelyCam
            • 5 years ago

            MathMan doesn’t know his stuff.

    • drfish
    • 5 years ago

    These updates are really starting to mean something now. For the longest time, DVI single- vs. dual-link was the only thing that mattered, and even then not to a large group. Now that 1080p has been positively mundane for a while, we’re seeing important specs that need to be seriously considered for future purchases.

    • odizzido
    • 5 years ago

    So 1.3 includes freesync right?

      • Alexko
      • 5 years ago

      It seems to remain optional. 🙁

        • odizzido
        • 5 years ago

        boo to that.

      • DPete27
      • 5 years ago

      [url=https://techreport.com/news/26451/adaptive-sync-added-to-displayport-spec<]You can do freesync on DP1.2 already.[/url<]

        • Airmantharp
        • 5 years ago

        But it’s not mandatory, and I think that was his point (and a question that I asked while reading the article, but didn’t see an answer to until I saw Alexko’s response above).

        Not being a mandatory part of the spec means that monitor makers have to choose which adaptive sync spec they want to include, and then add the extra hardware for it.

    • Neutronbeam
    • 5 years ago

    So what level of graphics card(s) can support this beastie? It’s way outa my league for sure.

      • Farting Bob
      • 5 years ago

      No graphics cards support this spec, because it was only announced today. But I’m sure by the next generation they will, even if no monitor/TV does.

    • Zorb
    • 5 years ago

    It’s about time….. let’s get it to the market!

      • cynan
      • 5 years ago

      In both desktop monitors AND TVs.

      Right now this is pointless as TVs will never support display modes requiring more bandwidth than the current fastest HDMI implementation.

      Having industry-specific display connector standards is more frustrating than ever now that bandwidth demands are increasing so rapidly in display tech. I’m surprised that more makers of AV equipment who don’t receive HDMI royalties (eg, Vizio, Oppo) don’t include an extra DP alongside HDMI in their devices, even if only in the vain hope of hastening the death of HDMI someday.

    • jdaven
    • 5 years ago

    Since Thunderbolt 2 only supports displayport 1.2, does that mean that these higher resolutions will not work through Thunderbolt until version 3?

      • MadManOriginal
      • 5 years ago

      Sounds like a conspiracy for Apple to sell more Macs to me :p

      • chuckula
      • 5 years ago

      It’s all a numbers game. TBolt 2 does up to 20 GBits on a single channel (you could gang connectors together in theory…) so a single connector won’t push 32 GBits.

      However, Thunderbolt 3 is going to 40 GBits so it could handle a DP 1.3 transport without too much trouble. [url<]http://hexus.net/tech/news/peripherals/68981-intel-thunderbolt-3-doubles-bandwidth-shrinks-connector/[/url<]

        • adisor19
        • 5 years ago

        Nice ! Thanks for unearthing that leak 😀

        Adi

      • adisor19
      • 5 years ago

      Yeah, that is pretty much guaranteed. It totally makes sense that the port will be shrunk as Apple is pushing more and more for slimmer designs. That 12″ rumoured MBA should be nice when it lands sometime next year.

      Adi
