Intel gets gamers up to speed with optimal IGP settings

Intel's latest integrated graphics processors might not max out recent games at 4K, but they're plenty serviceable. To help owners best take advantage of the graphics integrated into their CPUs, the company has set up a new gaming portal that recommends playable games and provides optimal graphics settings for different chips. Here are the recommended settings for Borderlands 2 on my Core i5-4690K's HD Graphics 4600 IGP—at a resolution of 1366×768:

[Screenshot: Intel's recommended Borderlands 2 settings for the HD Graphics 4600]

Providing gamers with a series of screenshots isn't the most sophisticated way to share this information, but it's nice that Intel is taking some of the guesswork out of tuning in-game settings. It's also convenient to get a list of games that will run on a given IGP, though it's not clear what standards are being used for a thumbs-up. For notebook users and those whose starting point for PC gaming is an Intel IGP, this site could be a helpful resource.

Comments closed
    • Meadows
    • 4 years ago

    Lipstick on a pig.

    • HisDivineOrder
    • 4 years ago

    This looks like one of those efforts begun with the best intentions and that’ll end as their interest wanes and new products begin sliding out the gate one after the other. It’ll end not because someone somewhere told them to stop, but because someone somewhere got too busy to keep it up.

    • albundy
    • 4 years ago

    finally! now i can play with notepad!

    • willmore
    • 4 years ago

    Intel chips have what can best be described as Business Graphics. Yes, it’s gonna be a slide show.

      • auxy
      • 4 years ago

      [url=https://techreport.com/forums/viewtopic.php?f=3&t=106680]Maybe I can convince you to reconsider your views on Intel HD Graphics.[/url]

        • willmore
        • 4 years ago

        Until I read your post, you might have. But you show the modern Intel CPU getting stomped by a 9-year-old board. To make the Intel GPU playable, you had to cut back the details and resolution in all but the most trivial of games.

          • auxy
          • 4 years ago

          If you read a little more carefully, it’s mostly just resolution. As I pointed out in another post, the things lacking are mostly raw fillrate and bandwidth — the chips are pretty capable as long as your requirements on these two things are moderate. Case in point, Firefall and Warframe on medium-high settings, and DmC on “Ultra”! (´▽`)

    • kamikaziechameleon
    • 4 years ago

    Here’s the thing. The darn promotion of Intel integrated graphics on this site is very misleading. You guys celebrate the fact that Intel works. That should be a given. Do their offerings work well? Not really, not in a way that makes one praise them. The thing is that most of these offerings are on premium systems, so the price-to-performance ratio is not in their favor.

    • kamikaziechameleon
    • 4 years ago

    When you compare what a new IGP does next to a 3-year-old GPU… it’s just like, WTF! Seriously, though, Intel graphics are barely as good as my phone’s. They seem more apt to decode video than anything else. Definitely not great at anything else, actually.

      • Andrew Lauritzen
      • 4 years ago

      Let’s try something fun. Without checking any benchmarks, rank these in the order you think they perform (say in 3dmark for some sort of semi-comparable number):

      Surface Pro 3 i3 (HD 4200)
      Surface Pro 3 i5 (HD 4400)
      Surface Pro (HD 4000)
      Dell Venue 8 Pro (HD Graphics)

      iPhone 5s
      iPhone 6 Plus
      iPad Air 2

      NVIDIA Shield Tablet (Tegra K1)
      NVIDIA Shield (Tegra 4)
      Nexus 9 (Tegra K1)

      Samsung Galaxy S6 (Mali T760)
      OnePlus One (Adreno 330)

      Obviously these are varying form factors and TDPs, but given your assertions I’m curious how close your perception is to reality 🙂

    • UnfriendlyFire
    • 4 years ago

    Anisotropic filtering: Lulwut
    Bullet Decals: Not necessary!
    Foliage Distance: Who needs those anyway?
    Texture Quality: *Puke*
    Game Detail: As long as you can tell the difference between a gun and a person, you’re good!
    Ambient Occlusion: LOL
    Depth of Field: Nope
    FXAA: Crank it up to max to replace MSAA! Who cares about blurry text!
    PhysX Effects: Ahahaha…
    Texture Fade: Nah

    1080p resolution? Hope you’re good at playing at 10 FPS, because you’re in for a disappointment. Or cough up and buy a $400+ CPU with Iris Pro graphics (when a Pentium Anniversary Edition plus discrete graphics, or a laptop with dedicated graphics, is more cost-efficient).

    *Forget about playing any games if the laptop has soldered single-channel 1333 MHz RAM.

    (FPS drops from a stable 60 to 5-30 when my laptop randomly switches from the Radeon 8750M to the i7-4500U’s IGP in the middle of playing TF2.)

      • jessterman21
      • 4 years ago

      Unfriendly shots fired!

        • UnfriendlyFire
        • 4 years ago

        *Silently mutters something about puns*

      • nanoflower
      • 4 years ago

      Agreed. I’m currently using a Pentium G3258 and a 650 Ti and have played through Borderlands 2 and the Pre-Sequel with most of the graphics features turned up and no major issues. I do see a few times when my CPU usage hits 100%, but it’s only for a few seconds and then it drops back down to 50-70%.

    • UberGerbil
    • 4 years ago

    [quote]Intel® Core™ i5-2500K Processor (6M Cache, up to 3.70 GHz)
    Intel® HD Graphics 3000
    No game settings are available. Find games playable on earlier Intel graphics. For more information visit the FAQ[/quote]

      • chuckula
      • 4 years ago

      It’s a sad sad day when we can’t play solitaire on Intel graphics.

      • DrDominodog51
      • 4 years ago

      It only seems to list Ivy Bridge and above. My i5-2410M doesn’t have anything listed for it either. I found [url=http://www.intel.com/support/graphics/intelhdgraphics3000_2000/sb/CS-032052.htm?wapkw=intel+hd+3000]this page[/url] for HD 3000, though. It doesn’t list settings.

        • Chuckaluphagus
        • 4 years ago

        My notebook has the i5-2410M as well, and I’ve recently been realizing that it’s a lot more capable than I had previously thought. It’s never going to run The Witcher 3 (or even 1, most likely), but it wasn’t having any trouble with [url=http://www.invisibleincgame.com/]Invisible, Inc.[/url] earlier this week, with only a few settings turned down. Half-Life 2 and Portal are even surprisingly playable. I played through some of Mass Effect with it as well.

        Keeping the drivers up to date has provided huge gains over its original performance. It’s never going to be high-end -- it might not even reach the level of low-end -- but get the latest drivers and see what you can pull off.

          • DrDominodog51
          • 4 years ago

          How much RAM is in your notebook? Mine has 4 GB, so 384 MB for the iGPU. I found that at 1024×768 the iGPU manages to max Portal at >30 fps. It handles Borderlands 2 at low settings (1024×768) like a champ, too. About the drivers: (cough) MacBook Pro (cough)… In any case, the worst game I throw at it is Borderlands 2. And with that throw, HD 3000 graphics hits me right in the head with the ball and burns my lap.

            • Chuckaluphagus
            • 4 years ago

            8 GB DDR3-1333 (that’s the speed it’s locked at on the motherboard, no option to change it since it’s a Thinkpad).

            I’m running Linux, so the Intel driver situation is pretty great – I can get updates direct from Intel as soon as they’re available. I know that the wait has traditionally been terrible if you’re running Windows. For Macs, isn’t the driver version linked to the OS X release you’re running?

            Also, earlier this year I opened up the laptop, replaced the thermal compound that had been applied at the factory 4 years ago, and blew all the dust bunnies out of the heat sink+fan assembly. It has made a world of difference in performance, since the CPU (and therefore the integrated graphics) stay much, much cooler, and I don’t run into throttling. I cannot recommend that enough, if heat is a problem for you.

            • DrDominodog51
            • 4 years ago

            I believe the driver version is tied to the OS X release, and to the specific point release as well. The problem isn’t heat for the CPU and iGPU. My lap getting burned by the metal surrounding the CPU is the issue.

            • Deanjo
            • 4 years ago

            [quote]My lap getting burned by the metal surrounding the CPU is the issue.[/quote]

            If it is a Mac, then RTFM! Apple does not make “laptops”. Right in the manual, they say:

            [quote]To operate the computer safely and reduce the possibility of heat-related injuries, follow these guidelines: [b]- Set up your MacBook Pro on a stable work surface that allows for adequate air circulation under and around the computer.[/b][/quote]

            No wonder gruesome yellow warning stickers are needed on everything nowadays. People don’t read!

            • DrDominodog51
            • 4 years ago

            You read instructions? I just look at the object then use it. As for the warning stickers, I just throw those away.

            • Deanjo
            • 4 years ago

            I suppose stop signs and red lights are just a suggestion as well.

            • DrDominodog51
            • 4 years ago

            Oh… So that’s what the red signs I speed by say….

            • Andrew Lauritzen
            • 4 years ago

            > Mine has 4 GB, so 384 MB for the iGPU

            Are you talking about the aperture setting in the BIOS? That’s actually only peripherally related to the iGPU’s ability to use memory. In general, if you’re on Windows Vista+, the operating system will allow you to use up to ~45% or so of your physical DRAM for graphics, so in your case ~1.7GB.
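
            To put a rough number on that, here’s a minimal back-of-the-envelope sketch in Python. The ~45% fraction is just the rule of thumb from this comment, not an exact formula from any OS documentation:

            [code]
            # Approximate shared graphics memory budget on Windows Vista+,
            # assuming the ~45%-of-physical-DRAM rule of thumb quoted above.
            def shared_graphics_budget_gb(physical_ram_gb, fraction=0.45):
                return physical_ram_gb * fraction

            for ram_gb in (4, 8, 16):
                budget = shared_graphics_budget_gb(ram_gb)
                print(f"{ram_gb} GB RAM -> ~{budget:.1f} GB usable for graphics")
            # 4 GB RAM -> ~1.8 GB, i.e. the "~1.7GB" ballpark mentioned above.
            [/code]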

            • DrDominodog51
            • 4 years ago

            I’m on OS X, so the OS literally will only use up to 384 MB while I have only 4 GB of RAM installed.

            • Andrew Lauritzen
            • 4 years ago

            Hmm yeah I’m not entirely sure how OSX works on that front, but I’d be sort of surprised if that was the case… I mean you couldn’t even really run a modern web browser with that sort of limitation on graphics memory, let alone any game.

            • Deanjo
            • 4 years ago

            [quote<]I mean you couldn't even really run a modern web browser with that sort of limitation on graphics memory[/quote<] Oh please....... you can't be serious.

            • Andrew Lauritzen
            • 4 years ago

            Check out the allocation lists in GPUView. It takes me only ~3-4 tabs in Chrome (fewer with media/flash) to get into that range. Add a few other applications and their swap chains and it doesn’t take much. WDDM 1.x will obviously move memory around and split command buffers if need be, but a system with only a couple hundred MB of memory available for graphics would not run well.

            Why is that so unbelievable to you? Would you expect these applications to run with 512MB of RAM? A good chunk of the resident data of modern desktop applications counts as “graphics memory” these days – basically anything that’s going to get draw/composited.

            Ultimately I find it hard to believe Apple would actually limit total allocated graphics memory to something so small for no real reason. It’s far more likely that someone is misinterpreting what some counter or graph means, which is incredibly common with memory stuff in general (ex. people reading task manager and not understanding committed, cached, paged, non-paged, etc. let alone people constantly misinterpreting “dedicated” vs “shared” graphics memory).

            • Deanjo
            • 4 years ago

            [quote]Why is that so unbelievable to you?[/quote]

            Because [u]millions[/u] (perhaps even billions) of systems out there operate modern browsers just fine on less than that for video memory, and with plenty of tabs open.

            • Andrew Lauritzen
            • 4 years ago

            Are you talking about dGPUs with <256MB of VRAM? Remember, on Windows at least they *also* allocate graphics memory out of system memory (same as integrated, up to ~45%) even on a dGPU.

            • Deanjo
            • 4 years ago

            No, I’m talking about the millions of systems in use in the business world that often have 64 – 256 Meg allocated to VRAM. Not to mention the millions of devices out there with less than 1 GB of total ram (including the video allotment).

            It must be a miracle that my Mac Mini 2006 with a crappy 64 Meg allocated to video (total of 2 Gig system ram) on the GMA 950 can have 30+ tabs open in Firefox and even function running @ 1920×1200.

            • Andrew Lauritzen
            • 4 years ago

            > No, I’m talking about the millions of systems in use in the business world that often have 64 – 256 Meg allocated to VRAM

            Right, on what OS do these systems “allocate 64-256 MB to VRAM”? This is really what I’m getting at, because this is not how Vista+ works, and it doesn’t sound like that’s how OSX works from the link I posted either (although please point me to better docs if they exist). It’s not statically allocated VRAM or whatever the notion is.

            Regarding systems with less than 1GB of physical DRAM… do those really run modern desktop browsers (Chrome, etc) with many tabs acceptably to you? Perhaps my definition of acceptable is just different 🙂

            • DrDominodog51
            • 4 years ago

            [quote]Right, on what OS do these systems “allocate 64-256 MB to VRAM”? This is really what I’m getting at, because this is not how Vista+ works, and it doesn’t sound like that’s how OSX works from the link I posted either (although please point me to better docs if they exist). It’s not statically allocated VRAM or whatever the notion is.[/quote]

            Have you ever run command-line only?

            • Andrew Lauritzen
            • 4 years ago

            Haha. I dunno about you but I just wget all my web pages and read the HTML. And… uhh… manually execute the JS in my mind 😉

            • DrDominodog51
            • 4 years ago

            [url]https://techreport.com/forums/viewtopic.php?f=1&t=42526&p=1260851#p1260851[/url]

            • Andrew Lauritzen
            • 4 years ago

            Oh hell yeah I remember Lynx! Stupid spoiled kids these days and their… uhh… images.

            • Deanjo
            • 4 years ago

            [quote]Regarding systems with less than 1GB of physical DRAM… do those really run modern desktop browsers (Chrome, etc.) with many tabs acceptably to you? Perhaps my definition of acceptable is just different :)[/quote]

            Yup. Those 1 Gig Meego chips run modern browsers just fine (even on Windows 8). Chrome itself, however, is just a plain old regular RAM hog.

            • Andrew Lauritzen
            • 4 years ago

            Hmm well I’ll give you that Chrome is a huge hog. I’ll have to give it a try in IE at some point. Obviously mobile devices/OSes work fine but they have very different browsers despite similar names 🙂

            • Deanjo
            • 4 years ago

            The Meego stick uses full-blown Windows and the exact same browsers. I would recommend using Firefox instead of IE on low-memory devices (it also works fine on the Raspberry Pi, another low-memory device that runs a full OS).

            On a Mac… I don’t bother trying anything other than Safari.

            • Andrew Lauritzen
            • 4 years ago

            Found some info here:

            [url]https://support.apple.com/en-us/HT204349[/url]

            So yeah, anything on more modern versions of OSX shares memory similar to what I described. On some of the older systems (2011) they aren’t totally clear about it. They say “[] allocates a base amount of memory to the integrated GPU based on how much system memory is installed”, so that’s where I assume you’re getting the 384MB for 4GB. However, you’ll note that they say the same thing for discrete GPUs…

            The way I interpret that - particularly the “base” part - is that they pin some amount of memory statically to the GPU, but can likely still allocate more shared memory dynamically. This isn’t uncommon for older OSes and chips, even on Windows. But it doesn’t mean that that’s *all* the memory it can use - like I said, that would be fairly absurd and pointless.

            If anyone has more docs on graphics memory management in OSX I’d be curious to read them, but typically Apple doesn’t talk a ton about that stuff. It would be easy enough to test on a system, though.

            • Klimax
            • 4 years ago

            Only if necessary. IE 11 on Windows 8.1 will generally use far less: 18MB/213MB in total for all apps (IGP).
            On my main system, where I currently have over 100 tabs and a load of other programs running, including two instances of VS 2013, I get 1.5GB of VRAM. (Titan)

            The system will not just waste memory. Don’t infer system behavior from Chrome. Terminally bad applications coded by a WTF team are generally bad for that.

            • Andrew Lauritzen
            • 4 years ago

            Yeah it definitely does depend on how much is available (as with regular DRAM), and fair point on Chrome. I’ve just had bad experiences on low memory systems, although I’ll admit that I have only recently switched away from Chrome (for this and other reasons).

            • Klimax
            • 4 years ago

            Oops. I think I may have misled you. The first case, with IE11, was on a system with 8GB of RAM and using the IGP; the other system has 16GB. (The total was for all apps, not total RAM.)

    • nanoflower
    • 4 years ago

    Just don’t try to use this if you have a Pentium G3258. It’s not supported as I found out when I tried to use the site earlier today.

    • willmore
    • 4 years ago

    Set all settings to lowest values. Done.

      • bittermann
      • 4 years ago

      That’s kind of what I got out of the pic…turn all graphics eye candy to off!

      • Voldenuit
      • 4 years ago

      [quote]Set all settings to lowest values. Done.[/quote]

      Also, stick to games that run on DOSBox. On a serious note, that’s where all the good games live anyway (fires up Master of Magic, Fallout 2, and X-COM (1994)).

        • JustAnEngineer
        • 4 years ago

        The XCOM remake is a worthy successor to the original. Give it a try.
        [url]http://store.steampowered.com/sub/37429/[/url]

      • derFunkenstein
      • 4 years ago

      Yet surprisingly for HD4600 they recommend 1080p and “Medium” details in StarCraft II. I wonder what they consider playable.

        • DrDominodog51
        • 4 years ago

        They say 30 fps.

        • auxy
        • 4 years ago

        I don’t doubt that this is quite playable in SCII! SCII is going to be limited much more by geometry performance (read: CPU) than anything else.

        • tipoo
        • 4 years ago

        30 fps, and I believe that completely. I played Heart of the Swarm on a Radeon HD 4570, which is probably worse than the Intel HD 4600, and it worked fine at 720p on mid-low settings. If you look here, the HD 4600 is about double the performance of the Radeon 4570.

        [url]http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html[/url]

      • Chrispy_
      • 4 years ago

      And if you still don’t get framerates that you can use, make an 800×450 custom resolution and squint.

      • auxy
      • 4 years ago

      [url=https://techreport.com/forums/viewtopic.php?f=3&t=106680]No?[/url] Σ(゚Д゚ ) You can actually turn many detail settings up pretty high on Intel graphics; it's not shader power they're lacking, it's raw pixel fillrate and bandwidth. As long as you keep the resolution lowish (HD or so), they can handle medium-high settings in a lot of games.
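
      To put rough numbers on the fillrate/bandwidth point, here’s a small illustrative sketch in Python; the overdraw and bytes-per-pixel figures are assumptions for illustration, not measurements of any particular chip:

      [code]
      # Rough per-frame cost on a fill/bandwidth-limited GPU. Overdraw and
      # bytes-per-pixel are illustrative assumptions, not measured values.
      def frame_cost(width, height, fps, overdraw=2.5, bytes_per_pixel=8):
          pixels_per_sec = width * height * overdraw * fps
          bytes_per_sec = pixels_per_sec * bytes_per_pixel
          return pixels_per_sec / 1e9, bytes_per_sec / 1e9  # GPixel/s, GB/s

      for w, h in ((1280, 720), (1366, 768), (1920, 1080)):
          gpix, gbs = frame_cost(w, h, fps=30)
          print(f"{w}x{h} @ 30 fps: ~{gpix:.2f} GPixel/s, ~{gbs:.1f} GB/s")
      # 1080p costs roughly 2x what 1366x768 does in fill and bandwidth,
      # while most quality sliders barely move these two numbers.
      [/code]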

        • Nevermind
        • 4 years ago

        8x AA in 640×480… I’ll stick to etch a sketch.

          • auxy
          • 4 years ago

          Why would you enable anti-aliasing on a card limited by pixel fill …? (ノー`)

            • Andrew Lauritzen
            • 4 years ago

            I’ll add that MSAA is particularly expensive on current Intel chips so we highly recommend using post-process AA stuff in modern games (I believe there is even some control panel override version) as it offers a much better performance/quality tradeoff.
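
            The arithmetic behind that recommendation is easy to sketch; a minimal illustration in Python (the buffer formats are assumptions, not the chips’ actual layouts):

            [code]
            # Why MSAA is costly on bandwidth-limited IGPs: color+depth
            # footprint scales with the sample count, while post-process AA
            # (FXAA/CMAA) runs on the already-resolved 1x image.
            # Assumes 4-byte color + 4-byte depth per sample (illustrative).
            def render_target_mb(w, h, samples, bytes_per_sample=8):
                return w * h * samples * bytes_per_sample / 2**20

            for samples in (1, 4, 8):
                label = "post-AA (1x)" if samples == 1 else f"{samples}x MSAA"
                print(f"{label}: ~{render_target_mb(1366, 768, samples):.0f} MB")
            # 1x ~8 MB, 4x ~32 MB, 8x ~64 MB of render target to push across
            # the same memory bus the CPU is already sharing.
            [/code]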

            • tipoo
            • 4 years ago

            Yeah, I force override with Intel CMAA. It’s a bit better than FXAA in a few ways, and about the same (negligible) perf hit on my Iris Pro 5200.

      • fhohj
      • 4 years ago

      you all just wait. at some point Intel is going to release a beefy gpu.

      Intel GPU

      The jokes will stop.

        • tipoo
        • 4 years ago

        That would be pretty cool, actually. Their Gen 7.5 graphics are fine for their power draws, and Gen 8 with Skylake will be better… Multiply the number of EUs (as well as the front end to keep them fed), provide enough bandwidth, and they could make some competitive GPUs.

        It would take some convincing people about Intel drivers (they’re really OK now), but the more important thing is having a third competitor on the discrete graphics front. That would be awesome; goodbye, duopoly. Give it the old college try, Intel.

        Do I think they will? No, I think their dedicated card interest ended with Larrabee. But it would be interesting, nonetheless.

    • tipoo
    • 4 years ago

    So it’s not the most high-tech solution in the world, not like Nvidia’s tool that calculates the best settings for you, but it’s something. It’s good to know what will and won’t run on these chips, too; my Iris Pro has run most modern AAA games thrown at it, but the odd one out will be unplayable on it (e.g., Far Cry 4).

    What are they defining as playable, though? I’m assuming at least a locked 30 fps being possible? But they have FC4 on there, and I couldn’t get that up to 30 on minimum.

    I still wonder if Intel will ever decide to put the framebuffer in the eDRAM; it’s confusing why it’s not there. They say 128MB gives a lot of headroom, so why not give a bit to the framebuffer like consoles with eDRAM/eSRAM do?
    Also, how different is the 128MB cache from how low-end laptop chips from AMD/Nvidia used to do it, with a bit of video memory plus HyperMemory or TurboCache to access main system memory as well?

      • Thresher
      • 4 years ago

      NVidia’s solution isn’t actually calculated so much as adjusted. They keep a database of different types of hardware combinations, then give you a recommendation. It might be tweaked a little bit locally, but almost all of it is really just a database query.

      • Andrew Lauritzen
      • 4 years ago

      > I still wonder if Intel will ever decide to put the framebuffer in the eDRAM, it’s confusing why it’s not there. They say 128MB gives a lot of headroom, why not give a bit to the framebuffer like consoles with eDRAM/eSRAM do?

      What exactly is the question here? Pretty much all memory is backed by the EDRAM, including framebuffers, textures, vertices, etc. Even data passed from the CPU to the GPU (constants, dynamic textures, etc) can often stay entirely on-chip in the EDRAM.

      It’s all completely automatic, although some controls are available to set the cacheability of various resources/pages. After quite a lot of testing, though, we generally leave everything as cached, as the hardware invariably does a better job than trying to statically guess at it in software, despite what game devs may tell you 🙂

        • Nevermind
        • 4 years ago

        Nah, guess smarter.

          • Andrew Lauritzen
          • 4 years ago

          This is something that is extremely easy to exhaustively test, and I’ve personally done it for several games. Even the “optimal” cache settings for a given frame aren’t very far away from what “cache everything” gets and any static decision will lose to standard cache policies over many frames as working sets shift. There’s no way to know a priori what the best policy is without doing something extremely SKU-dependent. As is usual with this sort of software cache optimization, you may at best get a few % on a given SKU, but in exchange you’ve de-optimized it on any other/future SKUs.

          This is actually blatantly obvious on the consoles – a 32MB cache would provably do a far better job of bandwidth amplification than does the scratch pad, even with applications specifically optimizing for it. It’s not like this is news either… there’s a reason caches continue to exist no matter how many times an architecture tries to claim that you can just do it in software – for all but the most trivial streaming algorithms you need cache-line level metadata about accesses and so on to be efficient.
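
          As a toy illustration of that last point (emphatically not a model of any real GPU), here’s a sketch comparing an LRU cache against a statically pinned, scratchpad-style selection as the working set drifts from frame to frame:

          [code]
          import random
          from collections import OrderedDict

          CAPACITY = 1000                  # "cache" size, in lines
          random.seed(1)

          def frames(n=20, hot_lines=5000, accesses=50000, drift=200):
              """Per-frame access streams over a slowly shifting working set."""
              base = 0
              for _ in range(n):
                  hot = range(base, base + hot_lines)
                  yield [random.choice(hot) for _ in range(accesses)]
                  base += drift

          def lru_hits(stream, capacity):
              cache, hits = OrderedDict(), 0
              for line in stream:
                  if line in cache:
                      hits += 1
                      cache.move_to_end(line)
                  else:
                      cache[line] = True
                      if len(cache) > capacity:
                          cache.popitem(last=False)
              return hits

          all_frames = list(frames())
          total = sum(len(f) for f in all_frames)

          # LRU adapts continuously across frames...
          lru = lru_hits([x for f in all_frames for x in f], CAPACITY)
          # ...while a static choice made from frame 0 goes stale as the set drifts.
          pinned = set(sorted(set(all_frames[0]))[:CAPACITY])
          static = sum(1 for f in all_frames for x in f if x in pinned)

          print(f"LRU hit rate:    {lru / total:.1%}")
          print(f"Static hit rate: {static / total:.1%}")
          [/code]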

        • tipoo
        • 4 years ago

        Hm, back when it was out, someone at Intel supposedly told Anandtech that they specifically excluded the framebuffer… Did you guys change it after launch? Or was this just wrong?

        [url]http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3[/url]

        “The Crystalwell enabled graphics driver can choose to keep certain things out of the eDRAM. The frame buffer isn’t stored in eDRAM for example.”

        They did also note, though, that the 32MB eSRAM in the XBO would have been better as a cache than as the current developer-controlled scratchpad, as you said. Part of the developer headache on XBO.

        Also, can you elaborate on what’s considered “playable” in the above context?

          • Andrew Lauritzen
          • 4 years ago

          They may be referring to swap chain buffers in full-screen exclusive mode - on certain configurations those are write-through cached (not exactly the same thing as “not stored in EDRAM”, but yeah). Note, though, that modern engines typically only resolve to the swap chain as the very last step (tone mapping or similar), so it isn’t the source of much bandwidth use/reuse, as all of the rendering and assembly of the frame is done in offscreen buffers.

          Not sure about your “playable” question. I think a poster below noted that it says 30fps somewhere, if that’s what you’re asking.

            • tipoo
            • 4 years ago

            Cool, figured it was 30 FPS, just wanted to make sure. Although I could never get FC4 there consistently on the 4770HQ, even though the website lists it.

            Is the EDRAM the same in Broadwell as it is in Haswell in terms of latency/bandwidth? Are we allowed to know yet if Skylake would have a change there?

            • Andrew Lauritzen
            • 4 years ago

            The EDRAM setup is mostly the same on Broadwell as Haswell. There’s actually still a decent chunk of “extra” bandwidth to EDRAM that we don’t really need right now (i.e. you can scale up the speed of the GPU and don’t need to scale EDRAM for a bit), but suffice it to say we’ll increase it as needed. As I noted in another post, this sort of thing is actually pretty easy to test and optimize for.

            Unfortunately I can’t spill the beans on anything Skylake yet, but stay tuned 🙂 It’s not too far off now.

        • auxy
        • 4 years ago

        I like watching these kids reply to you without knowing who you are. ( *´艸`)

          • Andrew Lauritzen
          • 4 years ago

          I even changed my name and everything 🙂

          • tipoo
          • 4 years ago

          I knew, hence the Intel specific elaboration requests ¯\_(ツ)_/¯

      • gamerk2
      • 4 years ago

      EDRAM/ESRAM is both expensive to produce, and REALLY hard to manage properly. You basically would need game developers to code specifically for it.

      Also remember, embedded RAM only helps avoid RAM related bottlenecks; it doesn’t cover up weak shader performance.

      So in short: Not worth the cost.

        • Andrew Lauritzen
        • 4 years ago

        > You basically would need game developers to code specifically for it.

        As I explained above, this is not the case for Intel’s EDRAM implementation. It acts as a huge cache and does not really require any specific optimization beyond regular cache-friendly behavior (which benefits even SKUs without the EDRAM).

    • chuckula
    • 4 years ago

    Setting 1: Drop to text terminal.
    Setting 2: Start Zork.
    Setting 3: Be eaten by grue due to insufficient lighting from IGP.
