Intel graphics driver 15.46 fixes a slew of games

Intel's latest graphics driver release is a pretty big one. The new driver, version 15.46, adds game-specific support for three new releases: Epic's building-and-survival game Fortnite, Motiga's hero shooter Gigantic, and season two of Telltale's Minecraft: Story Mode. A number of other games and applications got fixes as well, and the release expands the API support of Intel's graphics processors.

Epic Games' Fortnite

Pyre, Master x Master, Secret World Legends, Pit People, Guilty Gear Xrd, Fallout 4, Euro Truck Simulator 2, Guild Wars 2, Deus Ex: Mankind Divided, Vikings – Wolves of Midgard, Rainbow Six Siege, For Honor, Watch Dogs 2, and the apparently shut-down MMO Lego Minifigures Online are the rest of the fixed-up games mentioned by name, and there's an "and other games" note as well. A number of the listed titles are built on the Unreal Engine, so it's likely any game using that middleware will benefit from the new driver. Intel also mentions that Halo 2 "and other DX9 games" should no longer suffer texture flickering.

Additionally, Guild Wars 2, Dota 2, and The Talos Principle should stop crashing. The latter two games only suffered crash bugs in Vulkan mode, and it's likely that the new driver's support for Vulkan 1.0.38 has something to do with the fixes. Driver version 15.46 adds support for OpenGL 4.5 and widens the Intel hardware's support for OpenCL and DirectX 12 optional extensions. Intel says it added support for the Computer Vision SDK and Deep Learning Deployment Toolkits, as well.

We should note that this new driver is only for Intel's ninth-generation graphics hardware, which means the IGPs of Skylake, Kaby Lake, and Apollo Lake processors. Older chips will have to use a prior driver version. (The newest driver for Haswell and Broadwell chips is 15.40, released in March.) If you want to read the full release notes, you can grab the PDF, or you can head on over to Intel's download site to simply grab the driver.

Comments closed
    • DavidC1
    • 2 years ago

    Klimax:

    People actually give the GMA X3000’s hardware too much credit. Drivers were an issue, but to say it was crap entirely because of them is not 100% true.

    The GMA X3000 suffered because Intel has the same mentality as Microsoft. There’s an adage saying Microsoft’s decisions make sense if you consider they do everything to make Windows successful. Well, Intel does everything to make their CPUs successful.

    So, they put a crappy hardware geometry engine on there. That’s how you get it to scale with CPUs. Why bother putting a proper hardware geometry engine and Shader Model support on an iGPU? Compatibility. People will accept slow performance, but not being unable to run something at all.

    That wasn’t the only fault. The GMA 950, its predecessor, had a 4/1 (pixel/texel) pipeline setup. They moved to basically a 2/1 setup to save die space and manufacturing costs. I also speculated back then that the GMA 4500HD would improve performance a decent amount, because one of the big changes was 2x geometry performance. To my surprise, it did not help as much as I was expecting. The improvement from 2x geometry just meant you didn’t need to resort to software (CPU) geometry like you did with the X3000. Gain = 50-70%

    The real fault revealed itself gradually. Ironlake, the GMA HD in Arrandale/Clarkdale Core processors, provided the answer. You see, with Extreme Graphics, Extreme Graphics 2, and GMA 900/950, they had immediate-mode but tiled rendering (which is why people believed it used PowerVR tech, but PowerVR is deferred tiled; Intel pretty much developed their own). It helped a lot in saving bandwidth. With the X3000, they gave that up.

    Unfortunately, they did not do much on the X3000 regarding bandwidth-saving features; it had pretty much nil compared to competing solutions. Because of that, probably even the geometry performance suffered. Ironlake used much-improved occlusion-culling technologies such as Fast Z Clear and Hierarchical-Z, which is what ATI used at the time. Performance jumped! Gain = 2.5x

    • nerdrage
    • 2 years ago

    Don’t really care about the drivers but… Fortnite looks pretty cool!

      • Demetri
      • 2 years ago

      I put some time into the closed alpha and it’s pretty fun. One of the coolest things for me was using the fort building options while scavenging to get access to secret locations. So if you found a house, you could build a staircase outside that would get you onto the roof, and then use the harvesting tool to make a hole in the roof, jump down into the attic and find all kinds of goodies.

    • ronch
    • 2 years ago

    Fixing DX9 issues in 2017.

    Better late than too late.

    • blahsaysblah
    • 2 years ago

    "Windows® 10 Creators Update features enabled (7th Gen Intel® Core Processors only)"

    What features are they?

    • ludi
    • 2 years ago

    The fix is in.

    • Topinio
    • 2 years ago

    Has Intel dropped support for Broadwell and earlier in its graphics drivers, then?

    There seems to have been no support for anything earlier than Skylake since March’s 15.40 … how’s it justifiable to only support iGPUs going back 23 months?

      • Captain Ned
      • 2 years ago

      I’ve got a Haswell i7-4790K and the 15.40 driver refuses to install on Win8.1, so I’m stuck at 15.34. A bunch of grepping tells me it would install under Win10, but I’m not ready for that.

    • CuttinHobo
    • 2 years ago

    I know Intel’s IGP performance has made progress over the years… but Fallout 4 and Watch Dogs 2? My derisive snort was so derisive and so snorty that it was painful and I oughta sue.

    You’ll be hearing from my lawyers at Dewey Rippem & Howe.

      • chuckula
      • 2 years ago

      What happened to Cheatem?

        • CuttinHobo
        • 2 years ago

        From what I read, Rippem promoted himself by poisoning Cheatem’s caviar. Really have to be wary of the junior partners on their way up.

      • DavidC1
      • 2 years ago

      It’s not that bad. If you run them at low enough settings, even the demanding games run at playable frame rates. And I mean 30 fps, not the 120+ some expect. It also comes at zero cost with your CPU.

      Honestly, some people have such high standards for some things that it’s borderline spoiled. There’s a place for everything; not everything has to be my way or no way.

      I used to play WoW with a Celeron D CPU and GMA X3000 graphics. Since geometry performance scaled with the CPU on that hardware, it performed particularly badly. Average fps must have been less than 20. Yes, I felt the lag, but I still did instances and leveled from 1 to 60 on it.

      A buddy of mine used to play the original Tribes on a Pentium 166 overclocked to 200. It lagged like heck, but he beat the crap out of most of us.

      Another annoying thing is people’s insistence that anti-aliasing settings make world-changing differences, or at least they express it that way. It honestly doesn’t make any difference if you are actually PLAYING the game. Acceptance goes a long way, actually.

        • CuttinHobo
        • 2 years ago

        It can run them, sure. I was gaming on a 386 well into the Pentium era so I know where you’re coming from. I’m by no means a 120fps-or-go-home guy, but I believe that if gaming is any kind of priority for you then you should do yourself a favor and save up for a modest graphics card with a solid price/performance ratio.

          • DavidC1
          • 2 years ago

          Tribes on a Pentium 166 @ 200 is a literal slideshow. From a different player’s point of view, his character was jumping from one position to another. We’re talking about 5-10 fps.

          It was very amusing to see how, when he got the latest computer, he ran old games like Quake 3 on an 8800 GTX at low settings because he was so sensitive to lag. I guess you need to know lag to notice it.

        • Klimax
        • 2 years ago

        GMA X3000… one of the more interesting IGP series, torpedoed by drivers (failure to use SSEx in most of the code). A slight change to the CAPS bits (especially under Vista/7, because of the translation of the fixed pipeline to shaders) and it would have run all the games of the day. And quite well, too.

        In fact, most IGPs were held back by drivers. A pity…

      • tipoo
      • 2 years ago

      The Iris Pros/Plusses with eDRAM could certainly run those.

      I wonder if the Pro line isn’t dead, actually. It would be interesting to see how they scale up, at least.

        • chuckula
        • 2 years ago

        They are still alive in higher-end mobile SKUs, but outside of mobile, Intel has shown a great reluctance to actually release a strong IGP.

        They certainly have the technical capability to compete with the upcoming Ryzen APUs, but I don’t think they care enough to want to do it.

          • tipoo
          • 2 years ago

          There’s no Kaby Lake or Skylake Pro, I don’t think. The upper tiers seem to just go with Iris Plus at best, with half the eDRAM.

          Maybe there was just no point in 128MB though.

        • CuttinHobo
        • 2 years ago

        Yeah, if Intel cared to try. They’re clearly content with barebones office-level graphics horsepower.

          • tipoo
          • 2 years ago

          Unfortunate. I’ve heard the Gen graphics team wanted to do a dedicated card, but upper management had no will for it.

      • RAGEPRO
      • 2 years ago

      I assure you Intel HD 630 can play both of those games at 1080p with reasonable settings. I was playing Skyrim on an Intel HD 4600 a while back, and as long as I kept it to 720p with no AA, I could actually turn the settings up to maximum and maintain 40-50 FPS. Skyrim is an old game, sure, but Haswell was a long time ago, too. Intel 9th-Gen is the real deal.

        • CuttinHobo
        • 2 years ago

        I stand corrected. 🙂 That doesn’t sound too far off from what my GTX 960 would likely achieve. Fallout 4 ran well enough with an HD texture pack; I haven’t tried WD2 yet.

          • _ppi
          • 2 years ago

          The 960 is far faster.

          I played Skyrim with HD textures on a 5770 at 1080p at highish settings at some 30+ fps.

          • RAGEPRO
          • 2 years ago

          Well, nah. Even a 750 Ti is a fair upgrade from Intel HD 630. It’s just all about keeping expectations reasonable. The GPUs themselves are capable; the biggest problem is that shared memory bus.

          Also, looking into it, Watch Dogs 2 is way more demanding than I realized, so I dunno about playing it on HD 630. Iris could probably hack it. Fallout 4 is much heavier on the CPU than the GPU, though; I’m confident it’d run perfectly fine on HD 630. If I had anything with an HD 630 here I’d test it myself. 🙂

          When you’re trying to tweak games to run on IGPs it really helps a lot to have a basic understanding of graphics technology so you know what settings need which type of performance. Anti-aliasing is a good example; shader-based AA like FXAA isn’t demanding at all, while trying to do multi-sampling (or god forbid, super-sampling) on an IGP is asking for pain. (More accurately, FXAA is demanding on a resource that Intel’s IGPs have to spare because the bottlenecks are elsewhere.) Anisotropic Filtering is no big deal on a discrete GPU, but turning it down or off can have real performance impact on an IGP.
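
          To make the anti-aliasing point concrete, here’s a rough, simplified illustration (hypothetical Python, counting only raw sample storage and ignoring framebuffer compression; RGBA8 color and a 32-bit depth buffer are my assumptions): MSAA multiplies the framebuffer the IGP has to fill and move around, while FXAA just adds one shader pass over the finished image.

```python
# Rough framebuffer footprint at 1920x1080, ignoring compression.
# MSAA stores N color and N depth samples per pixel, so fillrate and
# bandwidth costs balloon; FXAA operates on the single resolved image.
WIDTH, HEIGHT = 1920, 1080
BYTES_COLOR = 4   # assumed RGBA8
BYTES_DEPTH = 4   # assumed 32-bit depth buffer

def framebuffer_mb(samples_per_pixel: int) -> float:
    per_pixel = samples_per_pixel * (BYTES_COLOR + BYTES_DEPTH)
    return WIDTH * HEIGHT * per_pixel / (1024 ** 2)

print(f"No MSAA : {framebuffer_mb(1):6.1f} MB")   # ~15.8 MB
print(f"4x MSAA : {framebuffer_mb(4):6.1f} MB")   # ~63.3 MB
print(f"8x MSAA : {framebuffer_mb(8):6.1f} MB")   # ~126.6 MB
```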

          In general, fillrate and bandwidth requirements (both of which are at a premium on IGPs) scale quadratically with resolution, so a relatively small change in resolution (say, 1600×900 -> 1920×1080) actually means a great deal more work (about 44% more, in this case). Turning the resolution down below 1080 height (or even 1440 height, on current Pascal/Polaris hardware) rarely helps much on discrete GPUs, but stepping down a notch can REALLY help things on an IGP. It’s real ugly at first, but I promise you get used to it.
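
          To put a number on that, here’s a quick hypothetical back-of-envelope in Python (nothing from any benchmarking tool); the per-frame pixel count is what drives the fillrate and bandwidth cost:

```python
# Back-of-envelope: per-frame pixel counts at two common resolutions.
# Fillrate and framebuffer bandwidth requirements track this closely.
def pixels(width: int, height: int) -> int:
    return width * height

low = pixels(1600, 900)    # 1,440,000 pixels
high = pixels(1920, 1080)  # 2,073,600 pixels

print(f"1600x900  -> {low:,} pixels")
print(f"1920x1080 -> {high:,} pixels")
print(f"Extra work per frame: {high / low - 1:.0%}")  # ~44% more pixels
```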

          It’s also worth noting I think that some settings, particularly shadows, particles, and physics, have little impact on graphics performance because again, the bottlenecks are elsewhere. As a result, they can be left fully enabled. Stuff like shader quality, effects quality, and so on can usually be left on “high” with little or no impact on framerate. Likewise, you can usually turn textures up to medium or high with little performance impact thanks to a tech called DVMT that allows the IGP to borrow up to 2GB of main memory to use as “dedicated” graphics memory on top of whatever you already have set aside for it.

          Intel actually has a site set up to help people configure game settings for its graphics parts (https://gameplay.intel.com/), although sadly neither WD2 nor Fallout 4 is on it. The settings are also sometimes a bit questionable IMO; for HD 630 in Overwatch they recommend 720p "Low". I'm pretty sure this is gross overkill, as "Low" in Overwatch defaults to a 75% resolution scale; that means the game is actually rendering at something like 1152x600 or so (a quick check of that arithmetic follows below). Overwatch is not THAT demanding of a game, so I really doubt it needs to be turned down that low.

          Perhaps the best example of good optimization for playing on an IGP is the recommended settings for Skyrim on an HD 630: 1920x1080 with FXAA, "High" textures, and a mix of low, medium, and high settings otherwise. That's pretty close to what I used on my old Haswell IGP, although the framerate was pretty jank; I imagine their settings are aiming for a smoother framerate.

          Anyway, this kinda became more long-winded than I really wanted. The key take-away is that the days of Intel graphics being relegated solely to office machines are long past. Intel's graphics chips, even the Iris ones, don't really measure up to discrete solutions, particularly not in desktops. However, they're really only lacking in fillrate and bandwidth; they're full-featured graphics chips in a tiny little sub-10W power budget. If you compromise a bit on resolution (including anti-aliasing) you can get them to run almost anything pretty well. Alternatively you could spend a hundred bucks on an RX 560 and enjoy a no-compromises experience, at least in 1080p. 🙂
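
          For what it's worth, the "1152x600 or so" figure above works out if you read that 75% render scale as a scale on the total pixel count rather than on each axis; whether Overwatch actually applies it that way is an assumption on my part, so treat this as nothing more than a check of the arithmetic:

```python
# Arithmetic behind the "1152x600 or so" figure: apply a 75% scale to
# 1280x720's total pixel count (not to each axis separately).
base_w, base_h = 1280, 720

area_scaled = base_w * base_h * 0.75   # 691,200 pixels
claimed     = 1152 * 600               # also 691,200 pixels

print(f"75% of 1280x720 by pixel count: {area_scaled:,.0f}")
print(f"1152 x 600                    : {claimed:,}")
```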

            • Demetri
            • 2 years ago

            They may not have taken the default scaling into account, but I definitely wouldn’t recommend Overwatch on anything but Low on an IGP. It seems to have issues with mouse latency on low-end hardware. On an RX 460 at 1080p/75Hz, I have to drop it to Low (with scaling adjusted back up to 100%) to avoid any mouse lag problems, even though it can generally hold 75 fps on Medium. No such issues with an RX 470.

            • CuttinHobo
            • 2 years ago

            You should consider writing for a tech site…. 😀

            It would be interesting if the HD 630 showed up in a future TR round-up as a reference point. 🙂

            • RAGEPRO
            • 2 years ago

            The “IGP game performance scales weirdly in comparison to discrete GPUs” issue I was talking about in my last post makes comparing IGPs to discrete GPUs, even low-end ones, really complicated. If you compare them straight up, the IGPs look completely useless, which I think is why many people have that perception.

            I’d love to do an article reviewing and comparing IGPs and low-end GPUs but I simply don’t have the hardware, heh.

            • CuttinHobo
            • 2 years ago

            I see how that would make it very difficult to create an apples-to-apples comparison against discrete GPUs. I know I’d be interested in reading a TR IGP comparison (hello, Raven Ridge!) but since it would be purely academic for me, I won’t push any harder than “hey that would be a cool read.” 🙂

        • tipoo
        • 2 years ago

        I don’t know about the HD 630 doing Fallout 4 at 1080p; my Iris Pro 5200 had to run it at 720p to be passable.

        That’s not because the GPUs are bad, though; it’s just typical Bethesda performance.

          • DavidC1
          • 2 years ago

          tipoo: The Iris Pro 5200 was decent for an iGPU back in its heyday, but it’s old. The HD 630 pretty much performs the same. 3DMark11 represents iGPU gaming performance pretty well; the 5200 is 10-15% better there.

          The catch is that 3DMark represents a more or less ideal scenario. The Iris Pro 5200 had trouble scaling at higher settings and resolutions. The Gen 9 GPU in Skylake fixes that, and gaming performance reflects it. There are scenarios where the eDRAM makes a difference, but in most games they trade blows.

          Interestingly, they can’t make a high-end iGPU that does much better. The Iris Pro 580 pretty much sucks. I think it made sense that they don’t make it. It was barely 20% better than the preceding Iris Pro 6200.

    • DPete27
    • 2 years ago

    Still no word on Intel FreeSync support 🙁

      • Airmantharp
      • 2 years ago

      Wouldn’t that require new silicon?

      We might be waiting until after Coffee Lake for it, even if it’s been on their board since it came out. They may want to get HDMI Variable Refresh Rate in there too.

      • psuedonymous
      • 2 years ago

      Probably not until DP Adaptive Sync is a mandatory part of the DP spec rather than an optional extension.
      Currently, implementations of DP Adaptive Sync are singing to AMD’s FreeSync tune rather than the VESA group’s, and nobody really wants to be beholden to another vendor’s implementation if there is a non-vendor-specific option on the horizon.

      • DavidC1
      • 2 years ago

      Cannonlake seems like a good candidate. There are supposed to be some big changes in the implementation. Leaks indicate 40 EUs for GT2. If they scale everything up, then hopefully we’ll see nearly 2x the performance. It’s been a long while since we saw 2x on anything. Heck, we get mere 20-30% gains on GPUs nowadays! That’s nothing. A 20-30% gain is excellent on a CPU, not on a GPU. GPUs used to be 2x. Now we get Pascal/Polaris/Vega with identical performance at the same clock as their predecessors.
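
      As a rough sanity check, assuming today’s GT2 parts (HD 630 and friends) have 24 EUs (my assumption, not something from the leak), the EU count alone only gets you about 1.67x, so “nearly 2x” would also need clock, bandwidth, or per-EU improvements on top:

```python
# Rough scaling check, assuming the current GT2 configuration is 24 EUs.
# The 40-EU figure is the leaked Cannonlake GT2 number quoted above.
current_eus = 24   # assumed Skylake/Kaby Lake GT2 (e.g. HD 630)
leaked_eus  = 40   # rumored Cannonlake GT2

print(f"EU-count scaling alone: {leaked_eus / current_eus:.2f}x")  # ~1.67x
```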
