Intel 25.20 IGP drivers are ready for the latest Windows 10 update

Intel just released a new graphics driver right on time for the Windows 10 October 2018 update. The driver is numbered 25.20.100.6323 and adds support for the WinML machine learning framework and HDR10 output for laptop internal displays. As usual, this latest release also adds game-specific support for a few titles and fixes a few bugs.

Driver 25.20.100.6323 is a full WDDM 2.5 driver with support for DirectX 12 Shader Model 6.3 as well as HDR10 output. Some Intel IGPs already supported outputting HDR10 to external displays, but not to built-in laptop panels. Intel notes that this driver also improves the quality of the dynamic-range-expanding EDR mode.
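
For gerbils who want to verify whether a display is actually being driven in HDR10, DXGI exposes the active color space per output. Here's a minimal C++ sketch, an illustration rather than anything from Intel; error handling is trimmed, IDXGIOutput6 requires Windows 10 1703 or newer, and you'll need to link dxgi.lib:

    // hdrcheck.cpp: list outputs and report whether each is in HDR10 mode.
    #include <dxgi1_6.h>
    #include <wrl/client.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            ComPtr<IDXGIOutput> output;
            for (UINT j = 0; adapter->EnumOutputs(j, &output) != DXGI_ERROR_NOT_FOUND; ++j) {
                ComPtr<IDXGIOutput6> output6;
                if (FAILED(output.As(&output6))) continue;
                DXGI_OUTPUT_DESC1 desc;
                output6->GetDesc1(&desc);
                // The ST.2084 (PQ) color space indicates the output is in HDR10 mode.
                bool hdr10 = desc.ColorSpace == DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;
                wprintf(L"%s: HDR10 %s\n", desc.DeviceName, hdr10 ? L"active" : L"inactive");
            }
        }
        return 0;
    }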

Zone of the Enders the 2nd Runner: M∀RS should run well even on slower Intel graphics hardware.

WinML is—as savvy gerbils have no doubt already surmised from the name—the Windows Machine Learning framework. It implements the ONNX format for neural network models and is supported on every Windows 10 device running the 1809 update that just dropped. Intel's driver adds support for running WinML applications on its graphics processors, but only on systems sporting Kaby Lake and later processors. Intel says that doing so can bring "a substantial speedup" to machine learning applications, presumably in comparison to running them on the CPU alone.
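
For the curious, the WinML flow is only a few calls: load an ONNX model, pick a device, create a session, bind tensors, and evaluate. A bare-bones C++/WinRT sketch follows; it's an illustration rather than anything Intel ships, and "model.onnx" and the tensor binding are placeholders you'd replace with a real model's inputs and outputs:

    // winml_gpu.cpp: load an ONNX model and evaluate it on the GPU via WinML.
    // Requires the Windows 10 1809 SDK; link windowsapp.lib.
    #include <winrt/Windows.AI.MachineLearning.h>
    #include <winrt/Windows.Foundation.h>
    #pragma comment(lib, "windowsapp")

    using namespace winrt::Windows::AI::MachineLearning;

    int main() {
        winrt::init_apartment();

        // Load the ONNX model from disk ("model.onnx" is a placeholder path).
        LearningModel model = LearningModel::LoadFromFilePath(L"model.onnx");

        // The DirectX device kind routes evaluation to the GPU; on Kaby Lake
        // and later IGPs, this is the path the new driver accelerates.
        LearningModelDevice device(LearningModelDeviceKind::DirectX);
        LearningModelSession session(model, device);

        LearningModelBinding binding(session);
        // ...bind input/output tensors here by the names defined in the model...

        auto results = session.Evaluate(binding, L"run");
        return 0;
    }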

We're all about gaming on discrete graphics cards around here, but I've played quite a few games on my 8700K's built-in GPU. It's pretty competent, and if I so chose, I could now enjoy official support for NBA 2K19, PlayStation 2 port Zone of the Enders the 2nd Runner: M∀RS, or the Bruno-recommended Two Point Hospital. Intel notes that playing these games is only recommended on Intel UHD Graphics 620 or faster adapters.

Wolfenstein II: The New Colossus probably won't look this good on your IGP.

There are a fair few bugs fixed in this release. Some long-standing graphical anomalies in personal-favorite Phantasy Star Online 2 have finally been addressed. Crash bugs in Wolfenstein II: The New Colossus and Adobe Premiere Pro CC 2019 should be crushed. Panel Self Refresh should re-enable properly on laptops after watching full-screen videos. Miracast should once again go full-screen, and "multi-refresh-rate" displays should stop flickering intermittently. Finally, virtual machines should stop blanking their displays when testing 3D benchmarks.

All of the above goodness is only available for machines running Windows 10 64-bit on Skylake and newer CPUs. For the uninitiated, that means sixth-, seventh-, and eighth-generation chips. You can check the full list of supported IGPs in the release notes. Interested users can go right ahead and grab the 25.20.100.6323 driver from Intel's download site.

Comments closed
    • CScottG
    • 1 year ago

    Now if they’d just fix all the bugs with their video drivers on Linux... like that one annoying little bug that drops Xorg to the command line (among a host of other problems).

    • Chrispy_
    • 1 year ago

    I think they’ve screwed up and pulled the 25.20 driver.

    I went to the link, and the PAGE says “Version 25.20.100.6323,” but the download links are for .zip and .exe versions of 24.20.100.6323.

    Maybe the download files just have a typo in the name; I’ll find out in a couple of minutes.

      • Chrispy_
      • 1 year ago

      EDIT: Yup, the 24.20.100.6323 driver downloads are indeed the 25.20.100.6323 drivers. “Top-notch” quality control by the Intel driver team as usual; I guess that’s a sign of the quality of the rest of the driver :\

      I’m only installing it to see if it fixes the longstanding issue with custom resolutions and refresh rates not being exposed properly to the OS and applications on my i3-7100U. Expect an update saying that it’s still broken in another couple of minutes!

        • Chrispy_
        • 1 year ago

        OMG. It’s not actually broken anymore. New Colossus runs at minimum settings, 480x270 (technically the HUD is 960x540, but dynamic resolution is forced to the minimum of 0.5).

        GOOD NEWS EVERYONE! If your laptop has a 7th-gen Intel CPU or newer, you can now “run” Wolfenstein II: The New Colossus at 15-20 fps at 480x270!

        What a valuable contribution to mobile gaming this has been: AAA like you never experienced before, unless we’re referring to a Game & Watch powered by AAA batteries, ofc.

          • Airmantharp
          • 1 year ago

          I mean, I’ve gotten 60 fps easily with esports games like League of Legends at 1080p, so any improvement is certainly welcome, but I feel they’ve been doing pretty well so far with such limited hardware!

            • Chrispy_
            • 1 year ago

            The one game I want to run smoothly on my laptop is Diablo III.

            The latest Intel drivers still can’t deliver a reliable 30 fps at minimum detail and 720p. I can manage 40 fps when nothing’s happening on screen (standing around in town), but actual gameplay is 20 fps-ish, and it’s really no fun like that.

            • DavidC1
            • 1 year ago

            To tell you how badly your HD 530 laptop is running, here’s a comparison with a laptop using the low-power HD 520:
            https://www.notebookcheck.net/HP-ProBook-430-G3-Notebook-Review.154359.0.html#toc-energy-management

            It’s getting a 36.1 fps average at 1366x768 High settings, and that’s on single-channel memory. Weird throttling like that will cripple your system more than specs will, more than even things like single- vs. dual-channel memory. The HD 620 with dual-channel memory and optimal settings can get nearly 30 fps at 1080p Ultra:
            https://www.notebookcheck.net/Dell-XPS-13-9360-QHD-i5-7200U-Notebook-Review.178844.0.html#toc-performance

          • DavidC1
          • 1 year ago

          I’m not sure how you’re running it that badly.

          A YouTube video shows it running at 25-30 fps at 1280×720 low. That’s substantially better than what you’re getting.

          Granted, that’s on the HD 630, but Intel has been using the same GPU since 6th-gen Core.

            • RAGEPRO
            • 1 year ago

            The laptop chips run MUCH lower clock rates on their IGPs, which probably has an effect. But yeah, it shouldn’t be running anywhere NEAR that badly.

            • Chrispy_
            • 1 year ago

            Like most laptops, it’s single-channel. Sad but true 🙁

            Clocks are indeed pitiful on the 7100U’s IGP, that’ll be the 15W TDP not helping matters compared to the 65W or 95W desktop HD630.

            If you were gaming on a desktop though, you’d buy a GPU. If budget really was ridiculously tight, a $50 R7 260 on ebay or craigslist will run New Colossus at 720p60 medium, which makes a mockery of Intel’s 720p30 low.

            • DavidC1
            • 1 year ago

            There’s usually about a 50% difference between the 15W HD Graphics and the fully fledged one. That’s large, but still not as dramatic as you describe. You don’t need anywhere near 65W to get full performance; 35W gets you 90% of the way there, and a desktop part is maybe 10% faster on top of that.

            Stress tests show the GT2 version of the HD Graphics can generally run at about 1GHz, which isn’t far off the 1.2-1.3GHz it’s capable of.

            I guess the single channel could make it that bad. Other than that, I have to say something’s really wrong with the system settings. Laptops often have to use vendor-specific drivers that are substantially behind the official ones.

            • Chrispy_
            • 1 year ago

            Wishful thinking; even the HD 530 in this i7-6700HQ isn’t getting much above 600MHz, and that’s a 45W chip, not a 15W chip. Not that it matters, because this machine also has a 1060 Max-Q in it, but for academic purposes, the “up to 1050MHz” boost clock of this HD 530 is pretty optimistic. I guess that’s why Intel lists it as 350-1050MHz: it’s highly variable depending on temperatures and on the TDP remaining after the CPU has done what it has to.

            For the most part, people using IGPs are on laptops, so IGP performance with the desktop TDP is irrelevant 🙁

            • DavidC1
            • 1 year ago

            It’s not wishful thinking; your system is running terribly.

            https://www.notebookcheck.net/Dell-XPS-13-i7-8550U-QHD-Laptop-Review.257650.0.html#toc-energy-management
            https://www.notebookcheck.net/HP-ProBook-x360-440-G1-i5-8250U-256GB-FHD-Touch-Convertible-Review.333474.0.html#toc-power-management
            https://www.notebookcheck.net/Dell-Latitude-12-5285-2-in-1-Convertible-Review.216862.0.html#toc-energy-management

            Even a 7W Y-chip can reach nearly 700MHz, higher than your laptop:
            https://www.notebookcheck.net/Dell-XPS-13-9365-2-in-1-Convertible-Review.193704.0.html#toc-energy-management

            Many 15W U systems can run at 950-1000MHz. Your iGPU is throttling because of some weird firmware or driver behavior. That’s the tricky thing about computers, especially Windows ones: sometimes they throw something at you that makes absolutely no sense. I once saw an XPS Pentium M laptop stuck at its 800MHz LFM frequency and couldn’t figure it out at the time.

            • Chrispy_
            • 1 year ago

            All three of those links clearly point out that the CPU is pushed down to unacceptable levels when running those synthetic GPU tests.

            You cannot compare something like Unigine or Furmark to a real game. Games don’t like the CPU speeds those “high GPU clock” synthetic tests create.

            In order, your three links have:
            – the Dell XPS CPU stumbling along at only 1.4GHz, despite a 25W TDP in that state
            – the HP x360 rapidly throttling when up-configured, averaging just 1.0GHz CPU / 766MHz GPU at 15W
            – the Latitude’s i7 running 1.2GHz below its base clock(!), while still 200MHz short of its advertised 1150MHz GPU boost

            ...and that 7Y75’s GPU runs at 700MHz only if the CPU is at 1.1GHz on a single-threaded synthetic. These are your links I’m pulling the numbers from.

            My laptop may well suck, but it’s typical/average/median, because it matches up with other laptops (a Clevo at work, a Gigabyte P34 at work). I’m sure all these laptops can run Unigine and other graphics tests that load only a single CPU thread, like Furmark, at higher GPU clocks, but real-world gaming (to get back to the start of this discussion, New Colossus) needs three, ideally four, CPU threads at clock speeds well above the ~1GHz of a synthetic GPU-only benchmark.

            Edit: Here’s a video of a very similar laptop to mine (https://youtu.be/CVBBx5da1nY?t=840), running the absolute minimum possible settings like I was, 50% render at a 960x540 custom res. He’s barely managing to stay above 20 fps on the first level, which is tiny closed corridors and not representative of the majority of the game’s levels. My laptop comfortably managed 30+ fps on that bit, but my savegame about two hours in is utterly unplayable at 15 fps, as you’d expect.

            • DavidC1
            • 1 year ago

            I’m getting something interesting.

            UHD 620: https://www.youtube.com/watch?v=QOiZEl6FrcY
            UHD 630: https://www.youtube.com/watch?v=UoiKARXRlYQ

            You can see the 8550U chip in the first link is only running at 1700MHz or so and uses DDR4-2400 memory. The 8400 desktop system is constantly running at more than twice the clock (3.8GHz) and using DDR4-3200 memory. Both are running 720p Low.

            You can check it by watching the videos side by side. Most of the time the results are the same, except in one or two scenes where the desktop is running 10-20% faster. A faster CPU won’t do anything when it’s GPU-limited anyway.

            • Chrispy_
            • 1 year ago

            That i7-8550U does indeed seem to be within 10% of the desktop HD630.

            I would have to assume that i7 models, being premium parts, are less likely to get lumbered with single-channel designs by OEMs, and that only the most efficient chips are cherry-picked to be i7s. The mainstream i3 and i5 models will likely be binned by power efficiency rather than for defects, given how mature the 14+++++++ process is now 😉

            It’s a shame that standardized testing is nearly impossible on mobile chips, because each laptop vendor configures the chip differently, with minimal information exposed to the user on how it’s configured and barely-tweakable BIOS settings on most of them.

            • DavidC1
            • 1 year ago

            If it’s running at 600MHz like you said, then it must be a bug of some sort. Back in the Ivy Bridge generation, enough users reported the GPU not going above a 900MHz boost (out of a 1.15GHz max) that, with a combination of driver and hardware tweaks, Intel fixed it in later GPU generations. A game like Wolfenstein is cutting-edge enough that it’s not a focus for the driver team.

            But there’s generally a 50% difference between the U chips and the desktop ones; I’ve seen enough benchmarks to know that’s true. Anything that greatly deviates from that margin indicates either a problem with the system or something else (drivers, optimization). Their GT2 parts are efficient enough to extract most of their performance within the 15W U envelope; if you go higher, with GT3e and GT4e, that starts to change. Also, the i5s are generally the best performers, as their CPUs aren’t as low-end as the i3s’ and don’t hog all the power like the i7s’.

            My main system is a Core i3-7100 with HD 630 graphics. I could test Wolfenstein II with HWInfo open to see power and frequency data. Hope they have a demo or something.

            • DavidC1
            • 1 year ago

            So they do have a demo for The New Colossus. I downloaded it and ran a test, along with Diablo III and StarCraft II.

            New Colossus runs at 3.8GHz on the CPU and 1050MHz on the GPU and gets about 20 fps on low. Interestingly, going from TSSAA 8TX to no AA doesn’t seem to impact performance in a noticeable way. Intel’s gameplay.intel page lists the older Wolfenstein as OK but Wolfenstein II as not compatible, though it runs.

            I first ran an older driver, 23.20.16.4982, which used about 9.5W for the GT core; total package power was only in the 25-30W range. I then updated to the driver in this article, 25.20.100.6323. It could be due to something else, but GT core power went up to 10-10.5W and performance seemed about 10-15% better, also at 1050MHz. According to Notebookcheck’s results, Furmark uses 16W or so at maximum. Considering how GPU-limited Wolfenstein II seems to be, Intel could afford to raise GT power to 12W for another 20-30% gain while keeping the CPU at 3-4W.

            StarCraft II uses about 8W and Diablo III about 10W; package power is all in the 20-25W range.

            Notebookcheck’s UHD 620 review has a section on GPU core clocks. For two laptops running The Witcher 3, it’s at 1050-1100MHz:
            https://www.notebookcheck.net/Intel-UHD-Graphics-620-GPU-Review-Benchmarks-and-Specs.239936.0.html

            Another site, Ultrabook Review, runs NFS: Most Wanted for its GPU stress tests:
            https://www.ultrabookreview.com/22870-dell-xps-13-9370-review/

            Again, it runs at 1050MHz, and the GPU is using 14W in one screenshot. I know from experience with their iGPUs that the desktop parts waste power on the CPU side. The 15W U chips can perform at 60-70% of their 65-95W brothers in games despite the CPU running at twice the frequency and using 4-5x the power. The high clocks can help if you are getting 40 fps or more, but below that you are really wasting them, as the game is limited by the GPU.

    • Krogoth
    • 1 year ago

    Raja: Now witness the superiority of our driver platform!

    • DancinJack
    • 1 year ago

    So, does anyone know, does a WDDM 2.5 update require work from Nvidia and AMD if those are the companies you rely on for driver updates? I’d assume so, but things aren’t always as easy as they seem.

      • morphine
      • 1 year ago

      Well, I’ve already upgraded to the latest Windows update and so far I’m not seeing any ill effects. Using an Nvidia card.

        • DancinJack
        • 1 year ago

        Yeah, me too (GTX 1080 + 1809). I did have to reinstall the latest Nvidia driver because there was some weirdness going on with OpenCL + FAH, but that’s the only thing I noticed thus far. Nvidia hasn’t released an update since 1809 though, so maybe something is coming soon.

          • EzioAs
          • 1 year ago

          “I did have to reinstall the latest Nvidia driver because there was some weirdness going on with OpenCL + FAH”

          I’ve found that’s the case with every Feature Update. I just updated to WHQL 411.70 and everything seems fine so far. I was reluctant to update to this driver version because apparently there are problems with Firefox, but again, everything seems to be working fine.
