AMD will bring FreeSync to HDMI early next year

AMD’s FreeSync variable-refresh-rate technology has a lot going for it these days. Just pick out any FreeSync display you like on Newegg or Amazon and compare it to a similar display with Nvidia’s G-Sync tech on board. More likely than not, the FreeSync display will be a cool Franklin or two cheaper than its G-Sync competitor. FreeSync displays tend to have more input options than G-Sync monitors, too. The green team’s VRR (variable refresh-rate) screens often make do with a single DisplayPort connector. Intel has also confirmed that it’ll support the standard that underpins FreeSync—VESA Adaptive-Sync—in its future products. Given how many graphics processors Intel ships in its CPUs, its backing of Adaptive-Sync could have a decisive effect on the eventual outcome of the VRR wars.

In 2016, AMD wants to make FreeSync better and more broadly available. First, it’s bringing FreeSync to HDMI ports on compatible laptops and desktops. Second, it’s building support for DisplayPort 1.3 (with the High Bit Rate 3 link option) into its next-generation graphics processors. DP 1.3 with HBR3 offers a major increase in potential bandwidth, and AMD will put that extra throughput to use in some exciting ways. Third, FreeSync is coming to gaming notebooks.

AMD detailed these developments for us last week at a tech summit held by its Radeon Technologies Group. Let’s dive in.

FreeSync over HDMI brings buttery smoothness to more ports

We’ve already seen some improvements to FreeSync this year. In its Radeon Software Crimson Edition driver update, AMD enabled a feature called Low Framerate Compensation (LFC) for FreeSync displays whose maximum refresh rates are at least 2.5 times their minimum rates.

We now know that LFC’s software algorithm watches frame times and repeats frames as needed to keep motion smooth when frame rates drop below the display’s minimum refresh rate. This method is a much-needed bit of polish, and we’re glad to have it.
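
AMD hasn’t published the exact algorithm, but the basic idea is easy to sketch. Here’s a minimal Python illustration of how a driver might choose a frame-repeat count so the effective refresh rate stays inside a display’s variable-refresh window. The function names and the 40-144Hz example panel are our own, and this is only a sketch of the general technique, not AMD’s implementation.

    # Minimal sketch of low-framerate-compensation logic (illustrative only,
    # not AMD's actual algorithm).

    def lfc_supported(min_hz, max_hz):
        """LFC needs a maximum refresh rate at least 2.5x the minimum."""
        return max_hz >= 2.5 * min_hz

    def refresh_interval_ms(frame_time_ms, min_hz, max_hz):
        """Pick how many times to repeat a frame so the effective refresh
        rate stays inside the display's variable-refresh window."""
        repeats = 1
        while frame_time_ms / repeats > 1000.0 / min_hz:
            repeats += 1                       # frame too slow: show it again
        interval = frame_time_ms / repeats
        return max(interval, 1000.0 / max_hz)  # don't exceed the max refresh

    # A 30-ms frame (33 fps) on a hypothetical 40-144Hz panel gets shown
    # twice, for an effective 66Hz refresh inside the panel's window.
    print(lfc_supported(40, 144))              # True
    print(refresh_interval_ms(30.0, 40, 144))  # 15.0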

In its present form, FreeSync requires a DisplayPort connection to function. That’s because VESA Adaptive-Sync, the technology that underpins FreeSync, is part of the DisplayPort spec. Most graphics cards have at least one DisplayPort output these days, but HDMI ports are even more common, especially on laptops. Problem is, the HDMI spec doesn’t include any kind of provision for VRR tech right now, so unless you have a fancy graphics card with a lot of DisplayPort outputs, configurations like a large FreeSync Eyefinity group could be hard to set up.

AMD wanted to bring FreeSync support to HDMI ports without waiting for the next HDMI revision, so it’s using the spec’s provision for vendor-specific extensions to add variable-refresh operation to the protocol. This extension will allow present and future Radeons to talk to FreeSync-capable displays over HDMI.

Although AMD’s use of vendor-specific extensions may seem like the company is jumping the gun, it assures us that the move isn’t a risky one. If variable-refresh support is included in a future HDMI specification, that spec should be able to coexist peacefully with products that support AMD’s extensions. The company expects that its hardware should be compatible with a future HDMI specification for VRR operation, too.

FreeSync over HDMI will begin rolling out to Radeons that can already do the VRR dance in the first quarter of 2016, but monitor makers will have to build support for the tech into new displays. AMD has already partnered with Acer, LG, and Samsung to develop a number of HDMI-FreeSync compatible monitors in a dizzying array of aspect ratios and screen sizes. The company is working with Mstar, Novatek, and Realtek to incorporate its extensions into those companies’ display controller chips, as well.

FreeSync over HDMI is also coming to laptops with both an AMD APU and Radeon discrete graphics inside. The company’s extensions mean that laptops with those components will be able to perform VRR over their external HDMI ports, too.

 

DisplayPort 1.3’s copious bandwidth lets UHD content shine

Although we can’t share many details of them yet, AMD gave us a high-level overview of its next-generation Radeons last week. One of the tantalizing features baked into these upcoming GPUs is support for DisplayPort 1.3 with the High Bit Rate 3 link option. This type of DisplayPort link offers a whopping 32.4 Gbps of bandwidth, about 80% more than HDMI 2.0, and it can move all those bits using existing cables and connectors. DP 1.3 has several exciting implications for next-generation FreeSync displays, as well as for higher-resolution panels like 5K screens.

For one, DP 1.3 can drive 4K (or 3840×2160) displays with full color data at up to 120Hz. DP 1.3 also allows a graphics card to drive a 5K (5120×2880) display over a single cable at 60Hz with full RGB color. Consider that a 5K display has 78% more pixels than consumer 4K displays, and you start to get a feeling for just how much data is moving across that single DisplayPort 1.3 link.
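
Some quick back-of-the-envelope math shows why that matters. The sketch below compares the raw active-pixel data rates of a few modes against the usable payload of DP 1.3 (32.4 Gbps before 8b/10b encoding) and HDMI 2.0 (18 Gbps before encoding). It ignores blanking intervals and protocol overhead, so real requirements run a bit higher; treat it as our own rough estimate rather than official figures.

    # Back-of-the-envelope payload estimates for a few display modes.
    # Active pixels only: blanking intervals and protocol/audio overhead
    # are ignored, so real requirements are somewhat higher.

    def active_gbps(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    DP13_PAYLOAD = 32.4 * 8 / 10    # 8b/10b encoding: ~25.9 Gbps usable
    HDMI20_PAYLOAD = 18.0 * 8 / 10  # ~14.4 Gbps usable

    for name, w, h, hz in [("4K60", 3840, 2160, 60),
                           ("4K120", 3840, 2160, 120),
                           ("5K60", 5120, 2880, 60)]:
        need = active_gbps(w, h, hz)
        print(f"{name}: ~{need:.1f} Gbps, "
              f"fits DP 1.3: {need < DP13_PAYLOAD}, "
              f"fits HDMI 2.0: {need < HDMI20_PAYLOAD}")

    # 4K60: ~11.9 Gbps, fits DP 1.3: True, fits HDMI 2.0: True
    # 4K120: ~23.9 Gbps, fits DP 1.3: True, fits HDMI 2.0: False
    # 5K60: ~21.2 Gbps, fits DP 1.3: True, fits HDMI 2.0: False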

That extra bandwidth isn’t just for driving gobs of pixels, though. A theme of AMD’s summit was making “better pixels,” not just more pixels. By AMD’s definition, better pixels will be produced with a wider color gamut and a broader dynamic range. To guide this effort, the company has taken a page or two from the burgeoning ultra-high-definition (UHD) segment of the consumer electronics industry.

UHD content can be produced with an extremely wide color gamut (Rec. 2020). It also uses a modern transfer function (SMPTE ST 2084) to encode a much wider range of brightness values for UHD TVs and monitors, a capability commonly known as high dynamic range, or HDR.
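
For the curious, ST 2084 (often called the PQ curve) maps a normalized code value to an absolute luminance of up to 10,000 nits. The snippet below is our own straightforward transcription of the standard’s published EOTF constants and formula, included purely for illustration.

    # SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value in [0, 1]
    # to absolute luminance in nits (cd/m^2), up to 10,000 nits.

    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875

    def pq_eotf(e):
        """e is the non-linear signal in [0, 1]; returns luminance in nits."""
        ep = e ** (1 / m2)
        y = max(ep - c1, 0.0) / (c2 - c3 * ep)
        return 10000.0 * y ** (1 / m1)

    # Half the signal range already covers everything up to ~92 nits;
    # the top half is reserved for highlights all the way to 10,000 nits.
    print(round(pq_eotf(0.5)))   # 92
    print(round(pq_eotf(1.0)))   # 10000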

For reference, AMD says typical consumer displays today range from about 0.1 to 250 nits of brightness. Those displays also use the relatively narrow sRGB color space and a transfer function designed to mimic the characteristics of CRT displays (Rec. 1886). That function only accounts for a brightness range up to 100 nits, and it’s poor at encoding the darker parts of an image. 

UHD OLED displays could boost that maximum brightness to 500 nits (with pure blacks, thanks to the underlying display technology), while future LCDs could range from 0.0005 to 1000 or even 2000 nits next year. Those LCDs will likely use local backlight dimming to achieve higher contrast.
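
Translated into static contrast ratios, those brightness ranges span an enormous gulf. The short calculation below uses only the figures quoted above; it’s our own arithmetic, not AMD’s.

    import math

    # Static contrast implied by the brightness ranges quoted above.
    def contrast(min_nits, max_nits):
        ratio = max_nits / min_nits
        stops = math.log2(ratio)      # each stop is a doubling of luminance
        return ratio, stops

    for name, low, high in [("Typical SDR panel", 0.1, 250),
                            ("Future HDR LCD", 0.0005, 1000)]:
        ratio, stops = contrast(low, high)
        print(f"{name}: {ratio:,.0f}:1 (~{stops:.1f} stops)")

    # Typical SDR panel: 2,500:1 (~11.3 stops)
    # Future HDR LCD: 2,000,000:1 (~20.9 stops)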

10-bit color support is a rarity on today’s displays, but AMD expects it’ll be a key feature of displays going forward, since those extra bits are needed to represent the wider color and brightness ranges that UHD content requires.
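
The payoff from those extra bits is easy to quantify: a 10-bit channel has 1024 code values to 8-bit’s 256, so adjacent steps in a smooth gradient are a quarter the size. This toy example (our own, purely illustrative) counts the distinct shades each bit depth resolves across the same brightness ramp.

    # Quantizing the same 0-to-1 brightness ramp at 8 and 10 bits per channel.
    # More code values means smaller steps between adjacent shades, which is
    # what keeps wide-gamut, high-dynamic-range gradients from visibly banding.

    def quantize(value, bits):
        levels = (1 << bits) - 1            # 255 for 8-bit, 1023 for 10-bit
        return round(value * levels) / levels

    ramp = [i / 1000 for i in range(1001)]
    steps_8 = len({quantize(v, 8) for v in ramp})    # distinct 8-bit shades
    steps_10 = len({quantize(v, 10) for v in ramp})  # distinct 10-bit shades
    print(steps_8, steps_10)    # 256 1001 (the 10-bit count is capped by the
                                # ramp's own 1,001 samples, not by the format)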

Another piece of the UHD puzzle on the PC is Windows’ support for 10-bit color and HDR. Right now, Windows offers limited support for 10-bit applications, and the OS doesn’t support HDR content at all.

AMD says it can get around these issues for the moment by rendering UHD content in an exclusive full-screen context, since that mode bypasses Windows’ limitations. That’s not an ideal situation, but it should work fine for activities like gaming and movie-watching that are generally full-screen in the first place. As UHD content becomes more common, we’d expect that Microsoft will develop more fine-grained ways of handling and displaying that content outside of a full-screen context.

Today’s Radeon graphics cards will be able to support UHD gaming and photos, while those next-generation Radeons with HDMI 2.0a and DisplayPort 1.3 will be able to output 4K UHD content (including movies) at up to 60 frames per second with 10 bits per color channel.

The Radeon Technologies Group had a couple of UHD displays running HDR at its tech summit to demonstrate UHD content, and we were impressed by the clearer, more natural-looking images these displays showed, especially next to a (frankly broken-looking) conventional HDTV. The ugly tone-mapping that might have reared its head in older games on conventional displays was nowhere to be seen. UHD will probably get a lot of hype from consumer electronics companies soon, but based on these quick previews, we think these displays (and the content they’ll show) will be an exciting advancement.

You can see some examples of the higher refresh rates and different content types that DP 1.3 monitors may support in AMD’s image above. We’re most excited about the expected specifications of those 2560×1440 and 3440×1440 displays. 170Hz or 144Hz HDR content with FreeSync enabled? Sounds great.

Don’t expect these panels to hit the market immediately, though. AMD projects that single-cable-ready 5K displays will arrive in mid-2016, while 120Hz 4K panels with FreeSync support are expected to arrive in the fourth quarter of 2016. Even so, these higher-refresh-rate displays sound like they could herald a bold new era for the gaming monitor, and it’ll be interesting to see what GPUs AMD has up its sleeve to drive them.

FreeSync goes mobile, too

Gaming notebooks with Nvidia’s G-Sync onboard have been available for a while, and now AMD is getting in on the mobile variable-refresh game from a somewhat more entry-level angle. The company unveiled the first of these notebooks, a FreeSync-enabled version of Lenovo’s Y700 laptop, at its summit. The one qualification is that any discrete Radeon must drive the laptop’s internal display directly, not through an Intel IGP.

This machine has a 15.6″ 1080p display that can run in a VRR range of 40 to 60Hz. While it’s nice to see FreeSync make its mobile debut, we’ve got to pick a nit. FreeSync’s LFC tech needs a maximum refresh rate at least 2.5 times the minimum, and a 40-to-60Hz window is only a 1.5:1 ratio. We’d have loved to see a display with a broader refresh-rate range, like 30 to 75Hz at a minimum.

This particular Y700 is a fairly nice-looking machine that’s powered by an FX-8800P APU and Radeon R9 M380 graphics. As is typical for a notebook, the FreeSync panel doesn’t use a scaler chip. Instead, AMD presses some of the FX-8800P’s resources into service as a display controller to make the VRR magic happen.

AMD didn’t provide the full specs of this red-blooded Y700, but we did some investigating to get an idea of what this machine’s $899 list price gets you. The R9 M380 graphics chip uses 10 GCN compute units for a total of 640 stream processors, clocked at a round 1000MHz. AMD’s specs say the M380’s 128-bit memory bus can be joined to up to 4GB of GDDR5 memory running at up to 1500MHz. Based on those specs, the M380 should slot in somewhere between the GeForce GTX 950M and GTX 960M.
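
Those specs translate into rough theoretical numbers as follows. This is our own back-of-the-envelope math, assuming the usual quad-pumped GDDR5 signaling and two FLOPs per stream processor per clock.

    # Rough theoretical throughput for the R9 M380, derived from the specs
    # above. Real-world performance depends on far more than peak numbers.

    stream_processors = 640      # 10 GCN compute units x 64 SPs each
    core_clock_ghz = 1.0
    # Each SP can retire one fused multiply-add (2 FLOPs) per clock.
    fp32_tflops = stream_processors * 2 * core_clock_ghz / 1000

    bus_width_bits = 128
    gddr5_clock_mhz = 1500       # quad-pumped: 6 Gbps effective per pin
    mem_bandwidth_gbs = bus_width_bits / 8 * gddr5_clock_mhz * 4 / 1000

    print(f"~{fp32_tflops:.2f} TFLOPS FP32, ~{mem_bandwidth_gbs:.0f} GB/s")
    # ~1.28 TFLOPS FP32, ~96 GB/s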

Mobile G-Sync is only available in notebooks with GeForce GTX 965M cards and better, and notebook makers seem to have reserved the feature for the $1500-ish price bracket, so Lenovo’s Y700 could represent a nice way to get VRR on the go for less. It remains to be seen how many other manufacturers will hop onto the mobile FreeSync bandwagon.

Comments closed
    • fredsnotdead
    • 4 years ago

    HDR is conspicuously absent on the 5k display. Need moar bandwidth!

    • meerkt
    • 4 years ago

    “ahceive” -> achieve

    • WaltC
    • 4 years ago

    I’m still using HDMI, although I bought a decent DP-to-HDMI cable that works well. Next monitor, though, will be DP all the way.

    • juzz86
    • 4 years ago

    From the very bottom of my heart thankyou, AMD, for LFC. It is a thing of beauty and has made ARK (and indeed other titles/engines that don’t support CrossFire) much more palatable on my XL2730Z.

    Sincerely,

    Happy 295X2 Owner

    • Bensam123
    • 4 years ago

    Am I the only one that doesn’t like dynamic contrast?

    I HATE, literally hate when my monitor changes its brightness on its own. I spend a lot of time calibrating it to just the right brightness. With something like dynamic contrast or the new UHD it’ll blind me whenever the game or movie feels like doing that?

    Right now game makers have enough problems trying to implement HDR lighting that isn’t absolutely useless (walking out a door is like going from the darkest nether regions of a cave to a desert). I turn it off whenever I can because it’s almost always overdone. Adding something like this to it is just going to make it all the worse.

      • NTMBK
      • 4 years ago

      There is a difference.

      The annoying “oh god I’m blinded” effect in games is a dynamic remapping from HDR space to 8-bit space. Increasing the true bit-depth to 10-bits means that each individual channel can encode 4 times as many different values, meaning that those dynamic mapping tricks should be [i<]less[/i<] necessary.

        • Bensam123
        • 4 years ago

        Instead of increasing the brightness on the monitor, they decrease the brightness of dark areas in game. It’s just as annoying even if actual brightness doesn’t increase. Both are bad.

      • meerkt
      • 4 years ago

      HDR isn’t dynamic contrast.

        • Bensam123
        • 4 years ago

        Nope, but it does a very similar thing to achieve a different effect. They increase the brightness in the monitor to bring out certain colors and darken the monitor in dark scenes to achieve less washed out colors. It doesn’t try to achieve the same effect, but it does the same thing to achieve what they’re looking for. Both are annoying and why I pointed them out because it messes with brightness.

          • meerkt
          • 4 years ago

          The problem with dynamic contrast is that the brightness of the blacks and whites is tied together in LCD. It’s also not controlled well in terms of response speed.

          I assume you’re not against better blacks, or a better resolution for the extended range. That leaves the bright end. I assume it’s not meant to glare you, just add vividness. And it’ll be adjustable anyway, just like existing brightness controls.

            • Bensam123
            • 4 years ago

            No to both of those, but the way it’s done adjusts brightness and in my opinion that messes with clarity. When your eyes are adjusting you don’t have a very complete view of what you’re looking at.

            Moderate adjustments are one thing, going from one extreme to the next however isn’t. It’s not the speed that’s the problem either. I think going from a really dark scene to a really bright scene just makes it worse as there is no transition. Whereas if you walk outside there is almost always a certain amount of transition time between the two. On extremely bright days it can, at least for me, be straight out uncomfortable.

            You assume there will be options. Most games you can’t turn HDR off in anymore as it gives you a ‘competitive edge’, so they leave it on all the time.

            • meerkt
            • 4 years ago

            My understanding is that so far HDR is used for just small hilites. More radical things may happen in the future, but that’s for movie makers to decide. Either way it’s not going to be as bright as a sunny day.

            Display settings aren’t controlled by movies but by your monitor firmware. I’m yet to see a monitor without adjustable brightness. I’m highly certain that if people find wide brightness swings unpleasant, and if these become common in movies, you’ll have an “HDR range” setting to adjust to your liking.

            • BobbinThreadbare
            • 4 years ago

            I think you are confusing HDR the photographic technique and HDR the rendering technique.

            • meerkt
            • 4 years ago

            What makes you say that?

            • BobbinThreadbare
            • 4 years ago

            Because you’re talking about movies and not the annoying bloom effects modern video games use.

            • Ikepuska
            • 4 years ago

            That’s because the monitor tech is more like what’s used in movies. It increases the dynamic range. Just like in music. So your single scene contrast ratio will increase. That means in a single frame you can have a very bright and very dark spot. Nothing like the awful bloom of going from a dark scene to a bright one. That’s a different tech.

            [url<]http://www.cnet.com/news/high-dynamic-range-arrives/[/url<]

      • anotherengineer
      • 4 years ago

      Indeed.

      I usually set to manual and set it to 120 cd/m2 when I calibrate it and leave it that way.

    • Cuhulin
    • 4 years ago

    I think that it is too narrow a view to see VRR over HDMI as simply a matter of speaking to monitors in the GSync v FreeSync sense.

    We are in the midst of a major change in video standards both for computers and for televisions. VRR is just one of the developments that is changing things. After all, high-end monitors often have display-port connections. Televisions generally do not, mainly due to HDCP being required by Hollywood initially due to its fear of copying and now, imo, from liking the licensing fees collected for HDMI/HDCP connections. If AMD is going to compete in the increasingly hybrid world of set top boxes, computers hooked up to tv’s, and so on, it needs to bring its key technologies to HDMI.

    In addition, it is my opinion – and just an opinion – that VR is going to be a big thing in coming years, particularly for the enthusiast customers who currently pay the prices for GTX980ti or Radeon Fury type cards. The hardware demands that are being cited for effective use of this, like 90 FPS at 1080p resolutions, more realistic color spaces and the like will sell a ton of video cards in an era when new games rarely require a new CPU, let alone a new GPU. These cards are where the profits per card are highest for the manufacturers. I have no personal knowledge of this, but I have to believe that the potential profits have them working hard to support the new VR displays – but those too are likely to require HDMI and some form of HDCP as they become more general use items, and to do so with VRR included because micro-stutters interfere with the VR experience.

    In other words, this is a really big development, far beyond Samuel Jackson asking “What ports are on your monitor”.

      • NoOne ButMe
      • 4 years ago

      Indeed. “What ports are on your TV” is now the question,

    • DPete27
    • 4 years ago

    Didn’t read all comments since I’m on mobile, but from my understanding, LFC “corrects” the only preexisting difference between FreeSync and G-Sync….can we please drop this charade now Nvidia?

    • Theolendras
    • 4 years ago

    Timely, right before 4K really trickles down to the masses.

    • Freon
    • 4 years ago

    HDMI support shouldn’t be the leading item here, LFC should be. Gsync’s ability to double (or triple, or quadruple?) frames up has been a huge advantage up to this point. Now I think they’re on a much more level playing field.

    Keep in mind a lot of Freesync displays out there have a very narrow refresh range. 45-60, or 45-75. They’ll be left out in the cold, and I strongly recommend not purchasing one. I’d rather just have a fixed refresh rate than just a jarring experience when framerates start to touch on the minimum refresh rate of the monitor.

    MG279Q should be good to go, though. 35-90 is just over the 2.5:1 ratio required.

    This video does a great job showing the issue LFC should solve, vs Gsync which had always solved it:
    [url<]https://www.youtube.com/watch?v=VkrJU5d2RfA[/url<]

      • NoOne ButMe
      • 4 years ago

      That is NOT what Gsync does according to Damage’s report diving into Freesync where he talked to an Nvidia engineer about it instead of making up garbage like PCPer did. And it would be terrible. If you display a frame 2, 3, 4 times that means your input lag is effectively huge.

      On this page: [url<]https://techreport.com/review/28073/benq-xl2730z-freesync-monitor-reviewed/3[/url<] It is likely the solution AMD is/has added to its software is similar to what Nvidia has on its FPGA.

    • auxy
    • 4 years ago

    I have been saying for a VERY long time that I’m more interested in greater chroma resolution than spatial resolution. I love super-sampling and I love high-resolution displays but 8-bit per channel is old and has been old and has got to go.

    It’s POSSIBLE (my dad is colorblind and I’m a genetic mutant anyway) that I have tetrachromatic vision. I’ve never really looked into it that much, but I’ve suspected it for a long time as I’m very, very sensitive to variations in color. Or it could just be autism. ┐( ̄ー ̄)┌

    It’s for this reason that I don’t like watching Youtube videos or indeed most digital video, which almost always heavily subsamples chroma data and looks awful and washed-out to me. (I’ve also given more clothes than most people will ever own to charity because the color faded just ever so slightly and it wasn’t the right shade anymore. I can’t help it! It looks bad to me!)

    Either way, AMD comes through again, addressing the concerns I have. Nobody else pushes the envelope this way. Kudos to AMD!

      • davidbowser
      • 4 years ago

      I vote for the mutant option. 🙂

      One of the craziest things that happened when I calibrated (Datacolor Spyder) one of my monitors was that some colors on video started to look weird. I watch TV on that computer and live sports looks particularly odd with people’s lips and skin tone just off. Although I have not really investigated, I began to wonder if TV calibrates color for broadcast using a different default than what I am using.

        • auxy
        • 4 years ago

        Look up the differences between Rec.601 vs. Rec.709. Ahh, I hate video content. The old analog TV and film standards just screw up everything. ;つД`)

        [sub<]Also, I wasn't saying "possibly I'm a mutant", I was saying "possibly I have tetrachromatic vision" -- me being a genetic mutant is known and confirmed![/sub<]

        • tipoo
        • 4 years ago

        Yup, broadcast TV and PCs have different colour spaces, that would be it. The difference can be annoying when trying to cross streams.

      • UnfriendlyFire
      • 4 years ago

      I just got done with arguing with someone that there’s a difference between 480p 30FPS video and a 1080p 60FPS video. Said person also had a 1080p display, but couldn’t see a difference.

      “480p is good enough. What’s the point of 720p and up?”

        • auxy
        • 4 years ago

        [b<]MAIM THEM AS A MESSAGE TO THE REST くコ:彡[/b<]

          • LostCat
          • 4 years ago

          Seems like someone already did.

        • LostCat
        • 4 years ago

        • NeelyCam
        • 4 years ago

        How is this even possible? Unless the person was trolling…?

        • Timbrelaine
        • 4 years ago

        Maybe they should stick to podcasts?

      • Mr Bill
      • 4 years ago

      Interesting, Googled tetrachromatic, found a test, and if this is actually real, I seem to be also. I wonder if this is why my Nikon D80 photos seem so washed out but my friend’s more expensive Canon 5DS always seems to have better color.

        • auxy
        • 4 years ago

        Normally it’s only women as far as I know…

          • Mr Bill
          • 4 years ago

          I see 37… [url=http://fox6now.com/2015/03/02/how-many-colors-do-you-see-this-simple-test-may-or-may-not-reveal-something-fascinating-about-your-eye/<]tetrachromatic test[/url<] Although one of each color seemed to be more of an intensity difference than a color difference. Perhaps its the combination of GPU (A10-7850 BE) and monitor (Asus PA248Q IPS). I have no hardware to check color accuracy.

            • auxy
            • 4 years ago

            Did you even read the article…

            That said, it says you need two X chromosomes to be a tetrachromat and I only have one (and no other sex chromosome) so I guess not! (*’▽’)

            • Mr Bill
            • 4 years ago

            I read it, and if this even exists, only females get it. So no way I have it. The online thing must be bogus. I see 37 shades, intensities, or bands, whatever.

            • Arbiter Odie
            • 4 years ago

            That article, after the test, states

            “BUT, is this color test truly accurate?
            The online debunking site Snopes reports computer monitors are not capable of displaying the range of colors required for this test. Snopes says:

            “According to the researchers at New Castle University’s Tetrachromacy Project, computer screens do not provide enough color information to be able to ‘tap into’ the extra dimension that tetrachromats may possess. It is therefore impossible for an online test to investigate tetrachromacy.”

            I see 38 colors, and 39 distinct bars. I’m male, and I’m also pretty sure that color test doesn’t work. AH-IPS monitor, Dell P2414H in case it matters.

            • Mr Bill
            • 4 years ago

            + Agreed

            • Chrispy_
            • 4 years ago

            As a deuteranopic mutant myself who has spent the best part of thirty years reading and researching my 1-in-400 vision weirdness, I’d like to point out that any colourblindness test performed through an LCD panel is 100% worthless.

            • oldog
            • 4 years ago

            Turner’s???? Sorry, don’t mean to pry but the doc part of my brain still pricks up its ears now and again even in retirement. Feel free to ignore me by the way (my wife does as well).

            • auxy
            • 4 years ago

            A form of yes. (*’ω’*)

            • Forge
            • 4 years ago

            That’s not entirely true. Tetrachromacy is more common in females (~25% possessing some degree of tetrachromacy), but it does occur in males (sub-8% in most surveys), because while two chromatic genes are expressed from X and only one from Y, there are mutants with duplications. It’s possible to have an unlimited number of chromatic expressions, though duplications are generally an indicator of Bad Things happening to the genome.

            • Waco
            • 4 years ago

            I count 43 colors…but I highly doubt there’s anything special about my vision.

      • Cuhulin
      • 4 years ago

      The new HDR spec for video announced this past Summer should lead to better chroma resolutions shortly. They will trickle down from high-priced to consumer priced over time, but companies like Vizio will cause that to be a relatively short time.

      • Mr Bill
      • 4 years ago

      Have you tried HDR photography? Not for the chroma range but for exposure range. They are pretty neat…. [url=http://www.hdrsoft.com/<]Photomatix[/url<]

      • anotherengineer
      • 4 years ago

      8-bit??!

      Most panels are still 6-bit + FRC

      • Sammael
      • 4 years ago

      I too am more interested in HDR and the types of contrast we can get with things like oled displays. My only concern is that even displayport 1.3 is too limited to be the holy grail connector for the foreseeable future.

      One of those shots showed which resolutions HDR is supported on.

      4k and HDR @60Hz is OK
      4k and HDR @120Hz is completely missing.

      the ultimate display ought to have all of the following attributes

      -4k
      -120Hz
      -vrr range of 30-120Hz
      -low input lag
      -oled for perfect blacks and better contrast.
      -HDR

      Is there no bandwidth room left to use 4k with 10 bit color and HDR at above 60Hz? If there is room how much room?

      Is the lack of available bandwidth related to the overhead for hdcp 2.2? Is there a way to disable that and free up bandwidth for gaming applications that are not video playback related?

      If the answer is no, what about supermhl? That seemed like it came out of the gate with far more bandwidth than displayport stuck on version 1.3

      Is there any hope of including that on cards or is that too far out?

        • meerkt
        • 4 years ago

        OLED may not be the ultimate tech. You should list features, not technologies. 🙂 Perfect blacks and very good viewing angles are a big step in the right direction, but there are other issues. I think we haven’t yet seen OLED computer monitors due to image retention/burn-in or lifespan issues.

        I’d prefer to have a monitor capable of going down to 20Hz. But anyway I suspect OLED solves the problem of low variable refresh, since pixels respond much quicker.

        There’s no HDCP in PC graphics, and anyway I don’t think it uses real bandwidth.

      • Welch
      • 4 years ago

      LOL not trying to be a jerk here Auxy, but everything you just described explains A LOT about you and our conversations in the past ahaha.

      Fun read here though on tetrachromatic vision….

      [url<]http://tinyurl.com/mpshcht[/url<]

      • anotherengineer
      • 4 years ago

      Color deficiency is more common to males and passed down to males typically, not females.

      Now if you have a calibrated monitor you can try a few of these tests to determine what areas you have color deficiency in.

      [url<]http://www.color-blindness.com/ishihara-38-plates-cvd-test/#prettyPhoto[/url<] I know this because i have some red-green deficiency from my mom's dad.

    • mkk
    • 4 years ago

    When it comes to TV sets or largish monitors though, adding a DP port for more demanding uses can’t add that much to the end price. Or is DP more expensive to license specifically for TVs somehow..?

      • Deanjo
      • 4 years ago

      There is no license fee to add DP.

        • cynan
        • 4 years ago

        Whereas there is for HDMI. Can’t figure out why companies like Vizio (who don’t get any royalties) don’t throw a DP port on all of their TVs to distinguish themselves. (heck, their TV-sized monitors used to have DVI)

        Then again, as far as I know, Samsung and LG don’t get royalties for HDMI either, and they now have the largest TV market share.

        I guess the answer is that the number of people who would actually be interested in optimally hooking up a PC to a TV (beyond “hey, I get a picture!”) is so small that it’s just not worth any extra investment whatsoever.

          • meerkt
          • 4 years ago

          I’m guessing Vizio just goes with whatever the ODM of the electronics offers?

          BTW, DP doesn’t mix well with AV receivers.

    • The Egg
    • 4 years ago

    Only very new displays are going to be capable of Freesync, meaning they will already have Displayport. It’s all well and good to have another connection method, it just isn’t much of a “need” at the moment.

    I’d rather see them spend their efforts on getting a better refresh rate range. Many Freesync monitors only run adaptive sync between 40-90hz for instance, which isn’t very good at the bottom of the range (where adaptive sync is most useful).

      • EndlessWaves
      • 4 years ago

      It does seem an odd feature without an obvious audience. I wonder if it was a relatively small amount of effort and AMD felt that the PR value outweighed the developer time taken away from more useful features.

      Refresh rate range is down to the monitor manufacturers though. AMD stated from the start they support ranges such as 9-60Hz and 17-120hz so your 90hz screen should be able to go down to ~13Hz as far as AMD are concerned.

        • The Egg
        • 4 years ago

        I dunno, I think there’s more to it. Freesync displays consistently have narrower ranges than G-Sync, even when the chance is high that they’re using the same panel. It could be that the scalers are a bit more limited.

          • NoOne ButMe
          • 4 years ago

          Yes. Using a cheap scaler relative to the cost of an FPGA does that.

      • jensend
      • 4 years ago

      I’m plusing you for saying we need more attention paid to the bottom of the refresh rate range. People underestimate how important this is and overestimate the importance of 100+Hz max refresh.

      But you’re wrong about not needing adaptive sync over HDMI. DisplayPort is technically superior, was supposed to be royalty-free, and should have seen rapid adoption to replace DVI, but uptake has been frustratingly slow, and it looks unlikely that manufacturers will change that soon.

      I have to imagine this is partially due to back room politics involving the MPEG-LA (who claim that everybody using DP has to pay them royalties, though they had nothing to do with DP and their IP claims are probably garbage) and the HDMI Licensing people.

      If this move gets adaptive sync to more consumers it’s a win, especially if their method is eventually adopted as part of a future HDMI standard.

      • MrJP
      • 4 years ago

      From the article, doesn’t LFC essentially halve the minimum refresh rate of all supporting Freesync monitors anyway? If the monitor can do 40Hz and upwards, then LFC allows it to cope with frame rates down to 20Hz by doubling up on refreshes. That seems like a pretty significant step forwards.

    • anotherengineer
    • 4 years ago

    I see 10 bit per channel thrown around a bit. Does W10 desktop actually support full 10 bits per channel color?

    From what I recall about TRUE 10-bit color, is that the graphics card must support it, the monitor must support it, and under certain applications the OS must support it? Or is that not the case anymore??

      • drfish
      • 4 years ago

      I think if your monitor supports it then it’s mostly a driver thing after that, the realm of Quadros and FirePros I think.

        • anotherengineer
        • 4 years ago

        Yes, FirePros and Quadros do support 10-bit color.

        If consumer/gamer cards start to support 10-bit color in 2016 and “true” 10-bit color monitors become more affordable and mainstream, then yeah for us!!

      • Ryu Connor
      • 4 years ago

      Still the case.

      [url<]https://photographylife.com/what-is-30-bit-photography-workflow[/url<]

      • Jeff Kampman
      • 4 years ago

      UHD content isn’t just about 10-bit color. HDR content is a problem for Windows right now, since the OS has no native concept of what to do with it, as far as I know. I added the following paragraphs to the piece to clarify what Windows needs to do in the UHD chain:

      “Another piece of the UHD puzzle is Windows support for 10-bit color and HDR. Right now, Windows offers limited support for 10-bit applications, and the OS doesn’t support HDR content at all.

      AMD says it can get around these issues for the moment by rendering UHD content in an exclusive full-screen context, since that mode bypasses Windows’ limitations. That’s not an ideal situation, but it should work fine for activities like gaming and movie-watching that are generally full-screen in the first place. As UHD content becomes more common, we’d expect that Microsoft will develop more fine-grained ways of handling and displaying that content outside of a full-screen context.”

        • anotherengineer
        • 4 years ago

        Thanks. So it looks like they missed a few things in W10 after all.

          • sweatshopking
          • 4 years ago

          LUCKILY IT ISN’T FINISHED YET. DOUBT IT BUT MIGHT SEE IT WITH REDSTONE.

            • anotherengineer
            • 4 years ago

            Hope so. I’m sure eventually it will make it in, hopefully before w11.

        • jihadjoe
        • 4 years ago

        They should make more use of DirectX and use it to render the desktop. But I suppose it would feel kinda silly using DX12 to render the new flat theme.

      • drfish
      • 4 years ago

      So following up on this since I now own a 10-bit panel after buying an LG 34UC87C. I got home and was able to select 10-bit color in the Nvidia control panel. However I don’t have a copy of Photoshop or any means of verifying that anything is actually different. Will post more as I learn it.

        • anotherengineer
        • 4 years ago

        I do not believe that is a TRUE 10-bit panel, but an 8-bit with FRC.
        [url<]http://www.tftcentral.co.uk/reviews/lg_34um95.htm[/url<]

          • drfish
          • 4 years ago

          Yes, a 10bit [i<]capable[/i<] panel. FWIW to future Googlers the actual panel is a LM340WU2-SSA1 not the LM340UW1-SSA1 in the 34UM95.

            • anotherengineer
            • 4 years ago

            My Bad.

            Does it require DP to do it? Also not sure if a 980ti officially fully supports 10-bit either?

            • drfish
            • 4 years ago

            It does require DP. “official” support I’m not so sure about but it does let me select it and it switches to it. I just have no way to test it afterward. I think I’m good with hardware and drivers, just lacking the software and the content.

            • anotherengineer
            • 4 years ago

            hmmmmmm.

            Did you try Ryu’s link from above, there was a gradient in there, but I don’t know if that’s a definitive test though?

            • drfish
            • 4 years ago

            From my understanding I can’t get a valid test without Photoshop or something that support the output. A web browser won’t cut it. 🙁

            • anotherengineer
            • 4 years ago

            Whaw whaw whaw……………
            bummer

            [url<]https://cabbagepatchcait.files.wordpress.com/2013/05/debbie-downer.png[/url<] I wonder if a monitor calibration kit could validate it?

        • Pantsu
        • 4 years ago

        AMD and Nvidia only enable 10-bit support for FirePro and Quadro products in applications like Photoshop. GeForce and Radeons only support 10-bit for fullscreen DX applications, i.e. games. And there’s no real support for that in games currently. Alien Isolation has a deep color option, but I haven’t seen it make any difference.

    • wiak
    • 4 years ago

    “FreeSync over HDMI is also coming to laptops with an AMD APU or Radeon discrete graphics inside. The company’s extensions mean that laptops with those components will be able to perform VRR over their integrated HDMI ports, too. The one qualification is that any discrete Radeons must be driving the laptop’s internal display directly, not talk to it via an Intel IGP. We’ll take a look at one of these all-AMD gaming notebooks in a moment.”
    there are no “internal hdmi ports”, those are in fact eDP aka Embedded DisplayPort..
    and AMD laptops with Carrizo would work I assume, as they are all GCN 1.2+

      • Jeff Kampman
      • 4 years ago

      Sorry, we didn’t mean to imply that laptop displays are connected to the system’s graphics chip with some form of HDMI. You are correct in that any FreeSync notebook will be talking to its display via eDP.

      The correct sense of the paragraph is that AMD notebooks with both an APU and a Radeon discrete graphics chip will be able to perform FreeSync with external monitors using the machine’s HDMI port, if it has one. I’ve attempted to correct that section for clarity.

      • Forge
      • 4 years ago

      You misread. No one said “internal HDMI ports”. INTEGRATED. He’s saying that HDMI ports connected directly to the dGPU will work with Freesync, but ones routed through the Intel iGPU for Optimus or whatever AMD is calling the same tech will not work.

      • Ninjitsu
      • 4 years ago

      Why would you put “internal” in quotes when it’s not a quote?

    • Chrispy_
    • 4 years ago

    It’s a nice idea but I bet it’ll be two or three years before we see a freesync-capable HDTV.

    Fixed-frequency displays are so outdated but it takes television makers [i<]forever[/i<] to do anything that isn't directly related to broadcast or disc-based sales. Hell, even 24p support took two generations of optical media to come and die before displays and players existed that could support the format the media contents were recorded in correctly.

      • The Egg
      • 4 years ago

      [quote<]It's a nice idea but I bet it'll be two or three years before we see a freesync-capable HDTV.[/quote<] I think 3 years would be extremely optimistic. Gaming is the only use-case for adaptive sync, and the current generation of consoles aren't capable. TV manufacturers will have no incentive to provide adaptive-sync features until after the next generation of consoles are released.

        • Chrispy_
        • 4 years ago

        I would have thought that adaptive sync is perfectly suited to the primary purpose of a TV – media playback:

        Anime = 6, 8, 12, or 24Hz
        Movies = 24p (AKA 23.976Hz), 24Hz, or 48Hz
        PAL region broadcast = 25 or 50Hz
        NTSC/SECAM region broadcast = 30 or 60Hz
        Streamed web events 20, 30, or 60Hz

        What I mean is that if you plug in a media box, content box, PVR or whatever, 60Hz HDMI is a severe limitation when there’s a vast library of content that doesn’t play back at a nice 60Hz divisible framerate. And then of course you have to remember that most TV’s are designed to be connected to current-gen consoles, all of which are powered by AMD’s GPUs with GCN and supposedly freesync capability via a firmware update (if it’s not already integrated but just dormant).

          • meerkt
          • 4 years ago

          Decent TVs already support pretty much all of these through various standard fixed-refresh rates: 23.976, 24, 50, 59.94, 60. Integer divisions give also 30, 29.97, 25, and 20, 12, 6 (wherever you get these odd framerate sources from).

          You won’t always get all rates since ~24Hz is apparently considered an “advanced” feature, and maybe some US models have trouble with 50Hz.

      • meerkt
      • 4 years ago

      I doubt TVs will support adaptive sync anytime soon. PC/console support is an afterthought at best. It will probably only happen if a future HDMI revision supports it. That might take years to happen, if ever, then a few more years until TVs and other hardware support it (properly).

      I think 120Hz support is more likely.

        • drunkrx
        • 4 years ago

          There already is one Korean TV manufacturer that supports FreeSync with a firmware upgrade. Though it only gives you a 40-60Hz range. It is at least a start. We just need the big names to support it now.

          • meerkt
          • 4 years ago

          Which one? If I recall correctly, some of the cheap/unknown brand TVs also allow/work at >60Hz on HDMI, but that didn’t translate to widespread support. As long as it’s not LG or Samsung I don’t think it’s a trend indication.

    • DancinJack
    • 4 years ago

    Blah. Those prospective HDMI Freesync panels look not exciting at all to me. A whole lot of 1080p.

      • Demetri
      • 4 years ago

      I’m more interested in the panel type and refresh rate. Pretty sure several of those will be IPS, but doubt any are also 120hz+. I don’t think Samsung even makes a high refresh monitor at the moment. 1080 is cool with me because I only want a 24″ panel, and want to push a lot of frames to it.

    • USAFTW
    • 4 years ago

    Good news. Here’s to the day when every panel sold will come with VESA adaptive sync spec capable scalers. Something they should have had since the first LCD panels came out.

    • llisandro
    • 4 years ago

    Was pondering Pascal, but am now holding out for the blinged-out D@mAge-branded next-gen AMD GPUs.
    We all know that’s the real reason for the Wasson acquisition…

      • ImSpartacus
      • 4 years ago

      I will lose my shit if something like that happens. That’d be hilarious.

        • llisandro
        • 4 years ago

        picturing a video ad of him walking down the street, slowly and methodically.

        [quote<]I may be a little slower, but I take every step at the.......exact.......same.......pace.[/quote<]

          • ImSpartacus
          • 4 years ago

          Jesus Christ, I would die if that happened.

          And hell, gpu makers have done stupider marketing videos in the past…

          I’ve got my fingers crossed.

            • sweatshopking
            • 4 years ago

            What does Jesus have to do with it?

            • llisandro
            • 4 years ago

            I was always taught that microstuttering was God’s punishment for Adam eating that apple?

            • Kougar
            • 4 years ago

            AMD was Conroecified, and on the third power of three year they will rise again and be saved by The Wasson Sr, The Wasson Jr, and the holy grail of good driver optimization.

          • Redocbew
          • 4 years ago

          A new advertising campaign featuring Customer Man!

          [quote<]I don't always play games, but when I do, I prefer predictable frame pacing. Stay consistent, my friends.[/quote<]

          • tipoo
          • 4 years ago

          That would be amazing

      • USAFTW
      • 4 years ago

      Hell yeah! Radeon GPUs with drivers approved by none other than Sir Scott ‘Damage’ Wasson. I don’t know about you, but Scott joining AMD has me more interested in Radeons than GeForces from now on.

      • drfish
      • 4 years ago

      How does this not have moar upvotes? Need to think of other options, “Radeon dAMageD Edition”?

        • llisandro
        • 4 years ago

        I fully support more people getting Gold subscriptions in order to get me some moar upvotes 😉

        C’mon people, it’s Christmas, and it’s pretty clear Gyromancer has a healthy hardware appetite, help his dad out!

        • K-L-Waster
        • 4 years ago

        New For 2016: the Radeon D@m@ge Boxx.

        just for you, Customer Man.

      • Mr Bill
      • 4 years ago

      My first thought was hmmm, perhaps another value reason to try an R9 380X.

      • fhohj
      • 4 years ago

      I just pictured Scott in a trucker hat and KC jersey holding a gpu with white gold and diamond encrusted cooler and making a finger gun at it or a number 1.

      thanks for that.

        • llisandro
        • 4 years ago

        ahahahahhah! That’s what I’m talkin ’bout! Scott bringin in a new era of blue-colored AMD branding…

      • llisandro
      • 4 years ago

      Honestly though, a Damage-branded FreeSync+GPU combo would be pretty flippin sweet!

      • auxy
      • 4 years ago

      You know, I realize this is a joke, but I really would buy an R9 DAMAGE. (≧▽≦)

        • llisandro
        • 4 years ago

        Sorta not a joke, right? Motherboards use random overclockers the vast majority of consumers have never heard of to pimp their features. TR readers laugh at it, but also realize they aren’t the target demo for this kind of stuff.
        For a GPU manufacturer with (arguably) a history of second-best drivers it might make a lot of sense to slap some “hey this was optimized by some expert guy” advertising on their lower-end stuff. Especially if they think NVIDIA will be first-to-market with sub-28nm products- focus marketing on “the experience” rather than raw FPS, especially with NVIDIA still trying to wring more profit out of G-sync.

          • auxy
          • 4 years ago

          That’s… pretty good thinking! (´・ω・`)

    • Flapdrol
    • 4 years ago

    Saves $5 on a cable I guess.

    • Bauxite
    • 4 years ago

    Dear AMD, Hurry up and add hdmi 2.0a to your product lineup, because TVs are going to be doing HDR etc a [b<]lot[/b<] more than monitors, at least for awhile. My JS9500 from april already supports it.

      • ImSpartacus
      • 4 years ago

      Definitely. We keep hearing about how amd seems to be going about vrr the “right” way and that mass adoption is imminent. However, it’s still not there and I’m not sure if it will be for a couple years.

      I personally don’t want to invest in a vrr monitor right now because I don’t want to accidentally pick the “losing” standard and then get locked into one gpu brand. I change gpu brands too often.

      Maybe Nvidia will buckle and support freesync, but maybe not? They have the majority of gaming gpus on the market and that gives them a lot of power to push things in the direction they want.

        • xeridea
        • 4 years ago

        FreeSync is not locked. It is an open standard, and Intel will be supporting it. Nvidia is being stubborn hoping it will go away so they can sell their overpriced Gsync tech to lock you into their cards, so that is what I would worry about. Gsync can never become the “standard” when it is locked. It’s not like BluRay vs HD-DVD or VHS vs Betamax.

          • EndlessWaves
          • 4 years ago

          It wouldn’t be the first time consumers had chosen a proprietary standard over a joint industry effort.

          Although I do agree with you about the likely outcome in this case. With the huge price difference G-sync is effectively dead at the moment and it would take either major innovation on nVidia’s part or major collapse from AMD to stop the momentum from DisplayPort Adaptive Sync.

            • Airmantharp
            • 4 years ago

            With G-Sync, you’re paying more to get more- for now.

            AMD is going to have to work hard with panel makers and ASIC/scaler makers to broaden the refresh rate support, and generally speaking, gamers that are buying G-Sync displays only want and need the single DP input, while Nvidia has already included more inputs with the current G-Sync revision.

          • ImSpartacus
          • 4 years ago

          Then why would anyone even consider getting a gsync monitor?

          As I said before, I’m not making the major decision purchase a monitor until things settle (I keep monitors for many years), but I constantly hear people yapping about getting vrr monitors and gsync is mentioned as often as free sync.

    • RoxasForTheWin
    • 4 years ago

    Now if only their mobile gpus were competitive with the current 900ms

      • ImSpartacus
      • 4 years ago

      Yeah, they have a long way to go to get efficiency up in an affordable manner.

      They’ve already shown that could take fiji and drop clocks to at least play ball in the efficiency-minded markets. However, that’s an expensive strategy and it still doesn’t blow gm204 out of the water. Maxwell is just really efficient.

      • Hattig
      • 4 years ago

      Wait, did you not read the article? “Based on those specs, the M380 should slot in somewhere between the GeForce GTX 950M and GTX 960M.”

      Regardless, HDR displays and DP1.3 are definitely worth waiting for.

      Enough to put off buying a monitor in the near future anyway, unless you really need one. A 4K HDR Freesync monitor is just going to tick most boxes, unless you are stuck with a backwards non-Freesync capable system*.

      * 2017’s hindsight viewpoint of the current situation
