New Intel IGP drivers add H.265, VP9 hardware decode support

Intel graphics driver releases aren't usually very exciting, but this one is worth pointing out. Numbered 15.36.14.4080, this update imbues Haswell and Broadwell processors with hardware acceleration support for two next-gen video formats: HEVC, also known as H.265, and Google's VP9. Here's the relevant snippet from the "new features" section:

  • Improved video playback through partial hardware acceleration support for the VP9 video format.
  • GPU accelerated decode of HEVC video file format including both 8-bit and 10-bit support. This will provide improved video playback capabilities on the platforms.

Supported processors include the Core M family; 4th-gen Core CPUs with HD, Iris, or Iris Pro graphics; and "Select Pentium®/ Celeron® Processors."

Note that the VP9 acceleration is partial, so decoding won't be fully offloaded from the CPU cores. Still, I'm curious to see the power and performance implications of these new capabilities. If history is any indication, hardware-assisted decode should enable smooth playback at higher bitrates and resolutions, and it could also cut power use and extend battery life.

Incidentally, the drivers also add support for a handful of OpenCL and OpenGL extensions. One of those extensions, cl_intel_advanced_motion_estimation, "provides access to Intel's motion estimation hardware acceleration block that can be used by media processing applications including custom transcoders and image stabilization." Nifty. (Thanks to TR reader SH SOTN for the tip.)
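
For the curious, checking whether a driver actually exposes that extension is a matter of looking for its name in the device's CL_DEVICE_EXTENSIONS string. The sketch below is just an illustration, not anything from Intel's release notes; it assumes the first OpenCL platform's first GPU device is the Intel IGP, and real code would iterate over all platforms and devices.

    // Minimal sketch: does this OpenCL GPU device report
    // cl_intel_advanced_motion_estimation?
    // Assumes the first platform's first GPU device is the Intel IGP.
    #include <CL/cl.h>
    #include <cstdio>
    #include <cstring>
    #include <vector>

    int main() {
        cl_platform_id platform;
        cl_device_id device;
        if (clGetPlatformIDs(1, &platform, nullptr) != CL_SUCCESS) return 1;
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr) != CL_SUCCESS)
            return 1;

        // Query the size of the extension string, then fetch it.
        size_t size = 0;
        clGetDeviceInfo(device, CL_DEVICE_EXTENSIONS, 0, nullptr, &size);
        std::vector<char> extensions(size);
        clGetDeviceInfo(device, CL_DEVICE_EXTENSIONS, size, extensions.data(), nullptr);

        bool has_vme = std::strstr(extensions.data(),
                                   "cl_intel_advanced_motion_estimation") != nullptr;
        std::printf("advanced motion estimation: %s\n", has_vme ? "exposed" : "not exposed");
        return 0;
    }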

Comments closed
    • edwpang
    • 5 years ago

    I remember ATI once did the same thing with H.264 decode using pixel shaders quite a while ago, but they eventually dropped that approach and went fully to the UVD solution.

    From Wikipedia:
    The Unified Video Decoder (UVD) SIP core is on-die in the HD 2400 and the HD 2600. The HD 2900 GPU dice do not have a UVD core, as its stream processors were powerful enough to handle video acceleration in its stead…

    • psuedonymous
    • 5 years ago

    [quote<]cl_intel_advanced_motion_estimation[/quote<]I remember when Intel first announced Quicksync, there was a bit of a stink about them not letting x264 access any of the fixed-function blocks in order to accelerate encoding, leaving things in a state where you could use Quicksync (and have shite quality video arrive quickly) or x264 (and have shite quality video arrive even faster, or good quality video arrive slower). It wasn't until Haswell's Quicksync implementation that Intel managed to beat x264 on maximum encoding speed, and Quicksync's encoding quality settings still only go from 'warm ass' to merely 'ass'. If this is a sign of Intel opening up the back end Quicksync uses to other programs, this could provide a nice speedup to encoders, assuming there aren't any hidden gotchas.

      • rootheday3
      • 5 years ago

      Pretty sure QuickSync has been faster than x264 all along, and it has increased dramatically in speed since Sandy Bridge (while CPU perf hasn't jumped that much), particularly if you factor in a 2-core CPU vs. 4 cores; plus GT2 on all Core i SKUs on mobile and desktop (vs. just mobile and a few desktop parts with Sandy); plus the thermal envelope (perf/watt).

      Quality has gotten better too.

      Not saying that opening up the motion estimation hw isn’t a good thing for ISVs to use for encode or image stabilization or …

      But I am guessing you haven't seen any recent perf/quality data for HSW 2C + GT2 to back up your claims…

    • tipoo
    • 5 years ago

    Most reviews of the Iris Pro were done with the version 9 driver; they're now several largish revisions later, at 15.something. I'm kind of curious to see what the gains have been over that time. I've heard some talk that they match the 650Ms they competed with after all the updates.

    Especially since these chips have an automatically managed 128MB eDRAM cache, different ways of managing what goes in and out could have a huge performance impact. Still curious why they never put buffers in there; they wouldn't take up much of that excessive 128MB.

    • Norphy
    • 5 years ago

    Installed these drivers on my Dell Venue 11 Pro (7139). They did odd things™.

    PowerShell ISE wouldn’t open, came up with a JIT error and tried to open a debugger.
    Windows Photo Viewer just plain wouldn’t open.
    My helpdesk software wouldn’t either.
    The Intel Driver Control Panel also wouldn’t open.

    Rolling back to the old driver fixed it.

    Oh well, will have to wait for Dell to certify the driver I suppose.

    • Ninjitsu
    • 5 years ago

    Well, there goes Carrizo's week-long H.265 performance advantage over Broadwell. XD

      • chuckula
      • 5 years ago

      In fairness to Carrizo, a fixed-function decoder is more efficient. Given that AMD is shipping Carrizo about 8 months after Broadwell hit the market, they at least used the extra time to get HEVC decoding implemented in fixed-function hardware.

      Skylake will have HEVC decode too, so by 2016 most new chips will have acceleration for HEVC.

        • lycium
        • 5 years ago

        Sure, it's more efficient to have fixed-function hardware for H.265, but it's better overall to have a bigger pool of generically usable units. That's why we went from having separate vertex and pixel shaders to pipelines capable of doing both.

        Then again, there are arguments to be made in favour of “dark silicon”, increasingly specialised units that are only used sometimes, and if you basically only do video decoding (eg TVs) then it’s obviously best to have it fixed-function.

    • jackbomb
    • 5 years ago

    I wonder if they’re using GPU shaders to assist the video decode block with H.265 (which would be similar to NVIDIA’s approach) or if Haswell/Broadwell’s video block fully supports the new codec.

      • chuckula
      • 5 years ago

      It’s GPU shaders. The fixed-function hardware already does H.264, but fixed-function HEVC decode support doesn’t show up until Skylake.
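
      For what it's worth, an application can at least check whether the driver advertises an HEVC decode profile by walking the D3D11 video decoder profiles; that check can't tell hybrid (shader-assisted) decode apart from fixed-function decode, since the profile is reported either way. A rough sketch follows, assuming a Windows SDK new enough to define D3D11_DECODER_PROFILE_HEVC_VLD_MAIN (link against d3d11.lib):

          // Rough sketch: does this driver expose an HEVC decode profile at all?
          // Note: this cannot distinguish hybrid (shader-assisted) decode from
          // fixed-function decode; the profile is reported either way.
          #include <initguid.h>   // materialize the decoder-profile GUIDs in this file
          #include <d3d11.h>
          #include <wrl/client.h>
          #include <cstdio>

          using Microsoft::WRL::ComPtr;

          int main() {
              ComPtr<ID3D11Device> device;
              if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                           nullptr, 0, D3D11_SDK_VERSION,
                                           &device, nullptr, nullptr)))
                  return 1;

              ComPtr<ID3D11VideoDevice> video;
              if (FAILED(device.As(&video))) return 1;  // no video decode support at all

              bool hevc = false;
              const UINT count = video->GetVideoDecoderProfileCount();
              for (UINT i = 0; i < count; ++i) {
                  GUID profile;
                  if (SUCCEEDED(video->GetVideoDecoderProfile(i, &profile)) &&
                      profile == D3D11_DECODER_PROFILE_HEVC_VLD_MAIN)
                      hevc = true;
              }
              std::printf("HEVC decode profile exposed: %s\n", hevc ? "yes" : "no");
              return 0;
          }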

    • sweatshopking
    • 5 years ago

    downloading to test on my 4790k and r9 290. will let you know if it makes a difference.

    IT DOESN’T. STILL WORKS.

    edit: can’t actually install. it tells me my GPU doesn’t meet the minimum requirements. No idea why.

      • NeelyCam
      • 5 years ago

      [quote<]can't actually install. it tells me my GPU doesn't meet the minimum requirements. No idea why.[/quote<] Because it's a Radeon...?

        • sweatshopking
        • 5 years ago

        yeah, but I assumed I should still be able to install the drivers for my onboard GPU.

        • Andrew Lauritzen
        • 5 years ago

        Yeah, it's pretty common for desktop motherboards to entirely disable the integrated graphics in the BIOS if a discrete card is present. You may or may not be able to control this with a BIOS setting.

          • sweatshopking
          • 5 years ago

          Yeah, I see it’s not actually even listed in device manager. On my previous MSI board it showed both. Seems my UEFI disables it entirely! THANKS GIGABYTE.

            • Flapdrol
            • 5 years ago

            probably have to turn it on in the UEFI, or plug in a screen 🙂

            • derFunkenstein
            • 5 years ago

            Both, if my Gigabyte-based H87 setup indicates anything about SSK’s. If nothing is plugged in at boot time, it doesn’t show up in device manager, even if you plug in a monitor later.

            • VincentHanna
            • 5 years ago

            ah well, if you are really interested in it, GPUs are easy enough to unplug.

            All you need is a hammer and some tin snips and you’ll be golden.

          • DPete27
          • 5 years ago

          I'm pretty sure you can always(?) select in the BIOS what the motherboard boots with (Auto, IGP, or discrete)... unless, of course, your CPU doesn't have an IGP.

            • Andrew Lauritzen
            • 5 years ago

            Right, but some BIOSes will refuse to enumerate both (via UEFI), so selecting the IGP will effectively disable the discrete card and vice versa, even though Windows can run perfectly fine with both enabled.

            • auxy
            • 5 years ago

            Have never seen that in the modern (UEFI) era. Been using an IGP alongside a dGPU since Sandy Bridge.

            No reason to disable it since AGP went away.

            • Andrew Lauritzen
            • 5 years ago

            Yeah, it's a grab bag depending on the motherboard, but I've seen the issue on a few. More common on laptops, and on Macs it seems entirely unavoidable/unconfigurable (it always shuts off the IGP if discrete is present) :(.

            • Deanjo
            • 5 years ago

            [quote<]and on Macs it seems entirely unavoidable/unconfigurable (it always shuts off IGP if discrete is present) :(.[/quote<] No it doesn't. My MBP with its Nvidia graphics says otherwise. It is only in Windows that it gets disabled.

            • Andrew Lauritzen
            • 5 years ago

            Sorry, I wasn't clear; I meant on Windows. You can only boot Windows in BIOS mode (last I checked), and they entirely disable the IGP when booted that way. You can't even enumerate it as a PCIe device, let alone install a driver…

      • Deanjo
      • 5 years ago

      Must be your inferior OS. Works fine in Linux.

      • maxxcool
      • 5 years ago

      Quicksync off? Not exposed to the driver?

        • Deanjo
        • 5 years ago

        It is a limitation of Windows; dead serious about this. When a video card is added to a Windows system with a QuickSync CPU, those features are disabled. In Linux and OS X the limitation is not there, and the full features of QuickSync are still available. His AMD card with UVD is also well behind the competition in support of the newer codecs.

          • Voldenuit
          • 5 years ago

          There are some simple workarounds for this.

          I have a GTX 760 and have QuickSync encoding enabled on my 4560K (though I prefer to use x264 on the CPU for IQ).

          • Andrew Lauritzen
          • 5 years ago

          Windows can handle this fine; it's actually certain BIOSes that will disable one or the other PCIe device. See, for example, folks who game with a discrete GPU and use QuickSync to encode and stream to Twitch on the fly.

            • Deanjo
            • 5 years ago

            It's not the BIOSes that stop it. The people who are doing it are using the Virtu MVP GPU feature of their motherboard, which is a software shim that allows QS to be used alongside their discrete card.

            [url<]https://techreport.com/news/20217/lucid-software-lets-quicksync-video-mix-with-discrete-gpu[/url<] The other dirty little hack is to set up a fake display and put the second screen out into never-never land. [url<]https://mirillis.com/en/products/tutorials/action-tutorial-intel-quick-sync-setup_for_desktops.html#top[/url<]

            • Andrew Lauritzen
            • 5 years ago

            You're talking about something different here than I am. You're referencing software layers that are trying to do "magic multi-adaptor aggregation" stuff. Fine, but messy and irrelevant. I'm just talking about exposing the adaptors themselves to applications; if Windows didn't handle that just fine, these shims wouldn't even work.

            Windows 8 (.1?) fully supports headless adaptors, and a 3D application can enumerate and create a device on one.
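
            A rough sketch of that enumeration, assuming a Windows 8.1-or-later box whose firmware leaves both adapters enabled (link against dxgi.lib): walking the adapters with IDXGIFactory1 lists the IGP and the discrete card side by side, and on a board that hides the IGP, it simply never shows up in the list.

                // Rough sketch: list every graphics adapter Windows exposes,
                // headless or not. If the firmware hides the IGP, it will not
                // be enumerated here at all.
                #include <windows.h>
                #include <dxgi.h>
                #include <wrl/client.h>
                #include <cstdio>

                using Microsoft::WRL::ComPtr;

                int main() {
                    ComPtr<IDXGIFactory1> factory;
                    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

                    ComPtr<IDXGIAdapter1> adapter;
                    for (UINT i = 0;
                         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
                         ++i) {
                        DXGI_ADAPTER_DESC1 desc;
                        adapter->GetDesc1(&desc);
                        // %ls prints the wide-character adapter description.
                        std::printf("Adapter %u: %ls\n", i, desc.Description);
                        adapter.Reset();
                    }
                    return 0;
                }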

          • Klimax
          • 5 years ago

          And wrong about Windows. If a device is not exposed by UEFI, it is not usable. Not even your magical Linux will help you in that case, as a disabled device is a disabled device and there is no way around it. (No assigned resources -> no way to communicate.)

          As for multi-GPU installs, that has been possible since Vista (one of the reasons for WDDM). XP had limited support.

      • crabjokeman
      • 5 years ago

      CAN’T TELL IF TROLLING OR BRAIN CELL DEFICIENCY

        • sweatshopking
        • 5 years ago

        BOTH.

          • crabjokeman
          • 5 years ago

          I guess they are comorbid. :/

            • sweatshopking
            • 5 years ago

            NOT COMORBID, SYMBIOTIC.

      • albundy
      • 5 years ago

      hahaha! nice! can’t believe people are falling for this.

    • w76
    • 5 years ago

    That's a nice plus; x265 @ 4K puts an embarrassingly heavy load on my Sandy Bridge-era chip. In fact, it's the first mundane sort of task to do so. Still, it must be nice for the newer chips, despite their additional performance headroom.

      • willmore
      • 5 years ago

      Do you mean H.265? x265 is an H.265 encoder program that will tax any machine.

        • w76
        • 5 years ago

        Doh, yes. My mistake. It had x265 in the file name, indicating encoder, and my brain is getting old. (CLEARLY IT WAS LEGIT CONTENT I WAS VIEWING)

          • crabjokeman
          • 5 years ago

          Content legitimacy (much like ethics) is in the eye of the beholder.

            • willmore
            • 5 years ago

            [quote<]Content legitimacy (much like ethics) is in the eye of the copyright holder.[/quote<] FIFY

          • Pwnstar
          • 5 years ago

          CLEARLY

    • chuckula
    • 5 years ago

    Will try it out and let you know how some HEVC playback works on my Helix 2.

      • Voldenuit
      • 5 years ago

      So jealous. Hope you’re loving it!

        • chuckula
        • 5 years ago

        Given the form factor it comes in, I really like the Core M. Once again, compared to higher-power-consumption chips (especially the new Broadwell-Us), it's not a speed demon. However, given that I upgraded from a Core 2-era notebook with a gm4500 chipset GPU, it's a massive upgrade in both performance and battery life.

          • I.S.T.
          • 5 years ago

          Wow. Must be like a, what, 50% performance increase minimum?

          • Duct Tape Dude
          • 5 years ago

          [quote<]Core 2 era notebook with a gm4500[/quote<] Oh Penryn. That CPU/chipset era is still my favorite of all time.

            • chuckula
            • 5 years ago

            Yup, you got it.

            The old tank still works and I use it as a wireless bridge.

          • Flying Fox
          • 5 years ago

          How’s battery life these days?

            • chuckula
            • 5 years ago

            For standard office-type work with wireless networking for web/email/etc. it’ll do 8 hours. That number goes down if you are railing the processor with more intense stuff, but for what people consider “standard” operation with a light notebook/tablet it gets the job done.

            Lenovo is supposed to be coming out with a better keyboard that includes an additional battery in the near future, and I want to grab one of those to push the battery life closer to 12 hours.
