Our picks for the best gaming CPUs of late 2018

Intel’s recent launch of the Core i9-9900K has a lot of people asking what it means to have the best gaming CPU around. The answer to that question is complicated, and we wanted to take a break from our usual System Guide format to dive deep into this specific topic.

Intel’s Core i9-9900K

Let’s start from square one. For one’s CPU choice to influence gaming performance at all, a game has to be bottlenecked by the CPU to begin with. That may sound obvious, but not every game is CPU-bound. In fact, we’d guess that most aren’t. We have to dig deep into our Steam libraries for titles that care much about the CPU they run on when we’re getting set up for CPU reviews.

Older games like Grand Theft Auto V form the foundation of any CPU gaming benchmarking suite. GTA V is typical of many older titles in that it cares a lot about a CPU’s single-threaded performance. Plenty of modern games still care strongly about the performance of a single thread, too; Far Cry 5 seems to be one of them.

Developers’ use of next-generation APIs in newer titles isn’t a reliable predictor of CPU usage patterns, though. Gears of War 4’s implementation of the DirectX 12 API is as dependent on a single core as ever when frame rates start to climb. Sections of Rise of the Tomb Raider can hammer every core at high enough frame rates, but that only holds true with the game’s DirectX 12 renderer. RoTTR sequel Shadow of the Tomb Raider, on the other hand, doesn’t seem to stress CPUs much at all in DirectX 12 mode.

Crysis 3, a favorite of ours for exploring the performance potential of CPUs for gaming, can stress every single thread one can throw at it despite the fact that it predated DirectX 12. The same is true of Assassin’s Creed Origins and Assassin’s Creed Odyssey, both of which use DirectX 11 and still occupy every thread we can throw at them (although the gap between the lowest- and highest-performance CPUs in those titles often isn’t nearly as wide as it is with Crysis 3).

Even within the same game, API options can entirely change how a given title might stress a system. The Vulkan renderer in Doom’s 2016 reboot doesn’t max out any single core, for example, but its OpenGL renderer proves a great test of per-core performance, if nothing else. (Run Doom with Vulkan if you can, please.)

Point is, there’s no easy way to make a general statement about how games will load a CPU short of hands-on testing and liberal use of Windows’ Resource Monitor. Even if we can’t predict how every game will behave, we know from experience that as frame rates climb and resolution, image quality, or both decrease, the CPU becomes more and more likely to be the bottleneck for a system’s gaming performance.
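
If you want to do that kind of spot-checking yourself, a few lines of Python can log per-core utilization while a game runs. Consider this a minimal sketch, assuming Python 3 with the third-party psutil package installed; it’s a rough signal, not a substitute for proper frame-time testing.

    import psutil

    # Sample per-core CPU utilization once a second for 30 seconds while a
    # game runs. One core pinned near 100% while the rest sit idle hints at
    # a single-thread bottleneck; load spread across every core suggests the
    # game can scale with core count.
    for _ in range(30):
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print(" ".join(f"{load:5.1f}" for load in per_core))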

Going by experience, we also know that games that end up CPU-bound are usually stuck on a single thread. It follows that such a game’s performance ceiling can only be lifted by a CPU with the highest possible single-core clock speed (or an equivalent all-core overclock). Titles that occupy the entire CPU stand to benefit from an all-core overclock, too; speedier cores and more threads lift all boats in those unusual cases.

To crown a gaming CPU as “the best,” then, we want high single-threaded performance out of the box. We also want the highest all-core clock speeds we can get to handle the odd game that’s embarrassingly parallel, and we want plenty of overclocking potential if we can get it.

How to choose a gaming PC’s CPU

With all that in mind, picking the “best gaming CPU” for a given build is going to depend on how you want to play games. Do you have a 4K monitor and a hunger for today’s latest titles with their settings cranked? In our experience, your CPU won’t have a noticeable impact on frame rates. 4K gaming puts an insane load on your graphics card, and frame rates will probably stay in relatively pedestrian ranges unless you’re doing something crazy like pairing GeForce RTX 2080 Tis in SLI.

For a 4K gaming PC, then, your CPU choice should depend on what else you do with your system. If you don’t need lots of cores and threads for your day-to-day work, you can save some money on the CPU and put that cash into a more capable graphics card. We wouldn’t take that approach to extremes, though: pairing a $100 CPU with a $1200 graphics card would be a bit silly.

CPU choice can matter some when gaming at 2560×1440, although the performance differences between chips won’t be nearly as keenly felt as they are at the extremes of 1920×1080 gaming unless you have a GeForce RTX 2080 Ti. As always, the more powerful your graphics card is and the higher you want to push frame rates, the more likely it is that you’ll notice a difference between CPUs when gaming at this resolution. Still, we’d allocate as much money to a powerful graphics card as possible for the best 2560×1440 gaming experience. Your CPU choice should still mostly depend on what else you want to do with your PC.

At the opposite extreme, do you play esports titles at the lowest possible resolution and highest possible frame rates? You’ll want to pick a CPU with fewer cores and threads and higher per-core clock speeds—possibly one that’s unlocked for easy overclocking. As you lower your game’s resolution or image quality settings to the minimum to increase frame rates, you may find that you ultimately hit a point where frame rates will go no higher, and higher CPU clock speeds via overclocking may be the only thing that will push that FPS counter up any more.
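
A rough mental model helps explain that ceiling: the CPU and graphics card each impose their own frame-rate limit, and the slower of the two sets the pace. The sketch below uses entirely made-up numbers for illustration; lowering resolution raises the GPU’s limit, but the CPU’s limit stays put.

    # Hypothetical frame-rate ceilings in FPS; real values vary by game,
    # settings, and hardware.
    def effective_fps(cpu_ceiling, gpu_ceiling):
        # Whichever component runs out of headroom first sets the pace.
        return min(cpu_ceiling, gpu_ceiling)

    print(effective_fps(160, 90))   # GPU-bound at high settings: ~90 FPS
    print(effective_fps(160, 300))  # same CPU at minimum settings: ~160 FPS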

The above advice holds true for budget gamers running titles at 1920×1080 with a graphics card like a GTX 1050 Ti, GTX 1060 3 GB, or RX 570, even if those gamers aren’t pushing their systems quite to the same extremes. All of those graphics cards can be had for reasonable prices again these days, and they’re quite powerful compared to what modest budgets used to buy. Upgrading to the midrange GTX 1060 6 GB or RX 580 will only increase the likelihood of CPU bottlenecks at 1920×1080. To get the most out of those graphics cards, you can’t pick the slowest, cheapest processor available—you need a balanced system, and that might mean spending more than $100 on a CPU.

If you have a lower-resolution monitor, if you simply can’t afford more than a certain amount of money for a CPU, or if you only have the budget to upgrade your graphics card in an older system, not all is lost. You can shift some of a game’s load back onto the GPU by increasing image quality settings (anti-aliasing, especially), by using AMD’s Virtual Super Resolution or Nvidia’s Dynamic Super Resolution, or by enabling a combination of those things. The point is to exploit the untapped power of your graphics card to improve the gaming experience in ways unrelated to higher frame rates.

If higher frame rates are ultimately what you want in CPU-bound games—or your CPU is just so ancient that it bottlenecks modern graphics cards, period—you’re going to have to shell out for more processing power. We can help.

 

The best gaming CPUs of late 2018

The best budget gaming CPU: Ryzen 3 2200G

Yes, that’s a Ryzen 5 2400G, sorry. Subtract two in your head.

The Ryzen 3 2200G has four cores with a decent 3.7-GHz peak clock speed, and if you can find the extra money in your budget for a Hyper 212 Evo-class cooler, it can be overclocked on any B350 or B450 motherboard. Even this $100 Ryzen chip offers features that Intel has segmented out of its Pentium CPUs, like support for the AVX instruction set. That’s a nice performance bonus for apps that can make use of it. The included Wraith Stealth cooler isn’t half bad, either.

We think the 2200G offers a compelling feature set for a basic gaming system, even if you don’t intend to use the on-die Vega 8 IGP. TechSpot tested the entry-level Ryzen part with a GeForce GTX 1070 pushing pixels, and the site found the Ryzen chip a better companion to that high-end graphics card than Intel’s Coffee Lake Pentium G5400 in CPU-bound games.

If you’re really, really tight on cash and just want to dip a toe into PC gaming for the first time, the 2200G has a relatively powerful integrated graphics processor that one can credibly game on, although you’ll need fast and potentially spendy DDR4 memory to really make the most of that capability. DDR4 prices have declined of late, though.

The best midrange gaming CPU: Core i5-8400

If you just want a solid gaming experience, aren’t cash-constrained, and don’t intend to do much heavy lifting with your PC, Intel’s Core i5-8400 should be your default pick for powering systems with a midrange discrete graphics card.

Despite its modest sticker price, the i5-8400 consistently turns in the best average frame rates and lowest 99th-percentile frame times of any CPU in its price class. It often embarrasses much more expensive chips in those measures, too. The only downside of this chip is its locked main multiplier. The i5-8400’s 4-GHz boost clock and 3.8-GHz all-core Turbo speeds are as good as they’ll ever get out of the box.
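
As a refresher on those metrics: average FPS summarizes overall speed, while the 99th-percentile frame time captures the slowest 1% of frames, the ones that cause visible hitches. Lower is better, and 16.7 ms corresponds to a steady 60 FPS. Here’s a minimal sketch of the arithmetic, with made-up frame times:

    import statistics

    # Hypothetical frame times in milliseconds from one benchmark run.
    frame_times_ms = [8.3, 9.1, 8.7, 25.0, 8.9, 9.4, 8.6, 30.2, 8.8, 9.0]

    avg_fps = 1000 / statistics.mean(frame_times_ms)
    # statistics.quantiles returns 99 cut points for n=100; index 98 is the
    # 99th percentile, the frame time that 99% of frames beat.
    p99_ms = statistics.quantiles(frame_times_ms, n=100)[98]
    print(f"average FPS: {avg_fps:.1f}, 99th-percentile: {p99_ms:.1f} ms")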

Shortages of Intel’s 14-nm processors have recently driven up the i5-8400’s price at retail, but we still think it’s the best midrange gaming processor around. If you can’t find stock of the i5-8400 at a reasonable price, though, read on.

The affordable, do-it-all alternative: Ryzen 5 2600X

If you want to do more with your PC than just game, or you can’t find Intel’s Core i5-8400 in stock, AMD’s Ryzen 5 2600X is our pick. This $230 chip boasts some of the highest single-core boost speeds available from a Ryzen part, plus multi-threaded prowess that Intel simply can’t match for the price. That means the 2600X can do things like gaming and software-encoded streaming all at once without dropping frames for your Twitch viewers. The Core i5-8400 just folds under similarly heavy workloads.

AMD includes a great stock cooler in the box with this chip, and the 2600X is compatible with a wide range of affordably-priced B350, B450, and X470 motherboards. The tradeoff at this price point is a slightly lower ceiling for frame rates and slightly higher 99th-percentile frame times than the Core i5-8400 delivers in CPU-bound titles at 1920×1080, so unless you’re certain you’ll make use of all 12 of those threads, we’d stick with the Coffee Lake CPU.

The high-refresh-rate addict’s attainable pick: Core i5-9600K

If you need high single-core clock speeds and unlocked multipliers to get the most out of lightly-threaded games, we’d take a look at Intel’s $280 Core i5-9600K. This chip offers a 4.6-GHz single-core clock speed, ranging down to a 4.3-GHz all-core Turbo Boost speed under load. For reference, 4.3 GHz was where the i5-8600K’s single-core Turbo speed peaked.

Intel’s ninth-generation Core CPUs re-introduced solder thermal interface material under the CPU heat spreader, and our experience with that change so far suggests that it makes those chips easier to cool than their predecessors when they’re overclocked. That should make it fairly easy to push the i5-9600K even further for titles that need it.

We haven’t personally tested an i5-9600K yet, but we know from other sites’ results that it edges out even its i5-8600K predecessor in most titles. We’re not entirely sold on six cores and six threads for a PC that’s going to be asked to do any heavy-duty work outside of gaming, but there’s a large gap between about $200 and $300 that only the i5-9600K fills for high-refresh-rate gaming fiends.

The best value for a high-end, do-it-all gaming CPU: Ryzen 7 2700X

If you’re trying to build a powerful but affordable system that can game and then some, AMD’s Ryzen 7 2700X is a superb pick. Its included Wraith Prism CPU cooler allows the chip to extract the vast majority of its considerable performance potential by way of its XFR 2 and Precision Boost 2 logic. Unlocked Intel CPUs carry higher suggested prices than comparable Ryzen parts, and on top of that, you’ll spend quite a bit of cash on a cooler for them, especially if you intend to overclock. At its discounted $310 price of late, the 2700X just screams value, especially when paired with a B450 motherboard.

While you won’t get high-refresh-rate 1920×1080 gaming performance any better than that of the Core i5-8400 from the 2700X, this chip is a great foundation for 2560×1440 or 4K gaming, heavy-duty productivity workloads, or high-fidelity same-system streaming. Neither the Core i7-8700K nor the Core i7-9700K could handle streaming with OBS’ x264 “fast” preset in our most recent test, for example, while the Ryzen 7 2700X had no trouble delivering a smooth streaming and gaming experience with those high-fidelity visuals.

The gaming champion: Intel Core i7-9700K

If you want the highest possible frame rates from CPU-bound titles of all stripes without going overboard, Intel’s $385 Core i7-9700K is the final word (even though it sells for a slight premium at retail right now). The i7-9700K delivered average frame rates and 99th-percentile frame times on par with those of its more expensive Core i9 sibling in our recent tests, all for over $100 less than the i9-9900K’s list price. Need we say more?

We didn’t have much trouble overclocking our i7-9700K to 5 GHz on all of its cores, either, and that overclock heightened its already scorching gaming performance in tandem with our GeForce RTX 2080 Ti. Intel’s return to solder thermal interface material means the i7-9700K isn’t a bear to cool when it’s overclocked, either, unlike some older Coffee Lake parts.

Why not spring for the Core i9-9900K here? To be sure, the i9-9900K delivers unmatched all-around prowess in both gaming and productivity applications, but it’s also impossible to find for anywhere near its already-high $500 list price. Unless you absolutely need that unparalleled all-around capability, we think it’s best to put the extra $100 or more into other system components for a gaming PC.

For gamers’ specific needs, about the only thing the i7-9700K can’t do as well as that much more expensive chip is same-system streaming with higher x264 quality settings. You’ll need to upgrade to the Core i9-9900K if you’re a stream fiend who doesn’t want to sacrifice any performance. Since Intel’s highest-end ninth-gen CPU is so scarce and so overpriced at the moment, we find it hard to recommend to all but the most performance-crazed.

 

Should gamers shell out for high-end desktop CPUs?

Some gamers may have more money than sense, and they might look towards Intel and AMD’s high-end desktop platforms as a place to blow some of it. We’d advise against that move for a gaming system—it’s just wasted money these days outside of some specific circumstances. As we’ve already established, it’s the rare game whose performance scales linearly with core count, and if that were the only knock against high-end desktop processors for gamers, we’d still recommend against spending huge amounts on them.

As core counts have exploded of late, however, high-end desktop CPUs have employed novel on-die interconnects and on-package connections between dies that increase both inter-core and main memory latency, and they sometimes don’t clock anywhere near as high on a single core as their lower-end desktop counterparts do. That makes high-refresh-rate gaming on those platforms a dicey prospect, since you’ll often see lower frame rates and higher 99th-percentile frame times than you would with even the much less expensive Core i5-8400.

If you’re gaming at 2560×1440 or 4K, the limitations of high-end desktop CPUs will be less evident, but that’s true of any CPU at any price, not a reason to go and spend $1000 on a high-end desktop part. You really need to be doing something outside of gaming (or alongside gaming) that requires a combination of high core counts, gobs of memory bandwidth, and bountiful PCIe lanes from the CPU to justify stepping up to a high-end desktop platform. Gaming alone is not going to stress those resources.

For an extreme example of why some high-end desktop CPUs aren’t the best choice for gaming alone, Far Cry 5 doesn’t run right at all, at any resolution, on AMD’s uber-expensive Threadripper 2990WX. Sure, you can disable cores to get around this problem, but what’s the point of spending a ton of money on a ton of cores if you have to shut any of them off, especially if you were looking for multi-threaded oomph for heavy multitasking while gaming?

If you game, stream to multiple services, and archive your footage with CPU encoding at 4K and high bitrates, you might have cause to look towards high-end desktop CPUs as foundations for your gaming system. Otherwise, we’d stick with AMD and Intel’s mainstream platforms.

What’s next

Intel recently announced plans to refresh its Skylake-X CPUs alongside its introduction of ninth-gen Core processors. Those refreshed chips will all have an allocation of 44 PCIe lanes from the CPU and soldered heat spreaders, plus potentially higher clocks than their predecessors. Those improvements will likely make those chips more appealing next to AMD’s Ryzen Threadripper CPUs, but we don’t expect them to mean much for gaming PCs.

Asus’ ROG Dominus Extreme motherboard for the Xeon W-3175X

At the extreme high end, Intel also plans to bring a version of its extreme-core-count Xeon Platinum CPUs to high-end desktops as the Xeon W-3175X. This chip’s server-class C621-based platform will require exotic power, cooling, and enclosures to house its massive motherboards and associated infrastructure. Unless you’re at the very limits of what’s possible from today’s high-end desktops and find yourself wanting more, we doubt the 3175X will have much relevance for folks interested in mere gaming.

Past that, we expect the CPU landscape to remain stable through the end of the year. Go forth and buy the chip that best fits your needs and budget, and above all, have fun with your new system.

Comments closed
    • elites2012
    • 11 months ago

    when intel makes a new chip from the ground up that is not part of the same LAKE family, then you can do a proper review. the i9 series is still part of the LAKE family. programmers are still writing games for the BRIDGE family. intel needs a NEW chip and not an altered chip.

    • ronch
    • 12 months ago

    Going back in time and reminiscing about all the processors (and other hardware) that you’ve owned makes you realize just how awesome today’s CPUs are. This is primarily the reason why I totally appreciate my FX while playing old games like Space Quest 3-5 and FMV titles like Gabriel Knight 2. Lots of new indie games don’t demand the latest hardware either, so even 6-year-old PCs work just fine unless you really cheaped out.

    • Luay
    • 12 months ago

    A bit of background: I’m thinking about getting the Samsung 4K 65″ Q6 TV next Black Friday for PC gaming because it supports FreeSync. I suppose FreeSync will compensate for the 20-ms lag and my target of 45 fps, which would leave me with only the upcoming RX 590.

    My question is: what does everyone think would be the ideal CPU for this kind of build?

    I will most definitely turn down the graphics settings to keep games running at 30~45 fps @ 4K, and then dial down the resolution when that doesn’t work.

    Non-gaming apps most probably (but not certainly) won’t be running in the background while playing.

    Thank you in advance!

      • K-L-Waster
      • 12 months ago

      I would think an 8400 or [s]2700[/s] 2600 / [s]2700[/s] 2600X. (Edit to put in the AMD models I *meant* instead of what I *typed*.)

      • synthtel2
      • 12 months ago

      What kind of games do you want to play on it?

      The 2600 non-X seems like a good bet.

        • Luay
        • 12 months ago

        Thank you both for the suggestion. If the game is good I’ll play it. I have a backlog of 3 years of games, starting from Witcher 3 all the way to COD:BO4.

        I was waiting for HDMI 2.1 to splash the ca$h on a whole new theater PC (4k 120FPS HDR OLED, VRR, eARC, Atmos with ceiling speakers, the whole thing).

        Now I finished my home and HDMI 2.1 devices won’t start arriving till the 3rd Q of ’19. It’s a nice problem to have but a problem nonetheless.

        So a Polaris card with a Samsung Q6 seems like a decent setup until something truly worth getting arrives.

        Can I get away with a Ryzen 3 2200G? It’s $100 cheaper and I am going to stress the GPU as much as I can. Forget about the benchmarks; they do paint a positive picture, but they don’t help much in figuring out how games will run in multiplayer. I want to know if massive multiplayer games will stutter with the Ryzen 3.

          • Ifalna
          • 12 months ago

          They will. You could buy a top-of-the-line beast of a CPU and MMORPGs would still let your framerate tank in densely populated areas, because they suck at multithreading.

          30-45ish FPS should be doable, though; personally I would not want to play below 60 FPS.
          I recently tried out how my poor 1070 would cope with Final Fantasy XIV in 4K via Nvidia’s oversampling...

          BARF. I cannot stand playing at 35-40 FPS. Dear God, how could I game like this back in the day?!

          As for HDMI 2.1: the CPU is irrelevant here; you need to wait for GPUs. IIRC Nvidia’s 20xx lineup does not get HDMI 2.1, so I’m not sure how long it will take for them to appear.

          You can build a nice and appropriately beefy system and add an HDMI 2.1-capable GPU later.
          Until then, you can always play at 1080p and let the TV do the scaling. That’s what I am contemplating as well: getting a 49″ or 55″ X900F and playing @ 1080p till my next GPU purchase in a few years.

          • synthtel2
          • 12 months ago

          Games like Witcher 3 and CoD shouldn’t pose any trouble at all for a 2200G. MMOs in the more usual sense are on average lightly-threaded and can definitely see some CPU-bound time beyond 22 ms, so that might be a good cause to go for an i3-8100 and fast-ish RAM. If things like Battlefield or Planetside may be involved or you just want to keep your options open, 2600 and fast RAM (though both of those games sound unplayable to me regardless on a setup with that kind of latency).

    • unclesharkey
    • 12 months ago

    The 8400 is currently overpriced; even at Microcenter it costs 200 bucks.

      • elites2012
      • 11 months ago

      way way overpriced. intel employees get the chip for $75. that’s the cost of making it.

        • moose17145
        • 11 months ago

        The employee price has nothing to do with the manufacturing cost.

        The Employee price for ANY CPUs at intel is 50% of MSRP.

    • albundy
    • 12 months ago

    my r7 1700/16GB DDR4-3200/GTX 1070 aint goin’ nowhere for next decade. ok, maybe 10 years.

      • Shobai
      • 12 months ago

      Alright, but what’s that in deci-centuries?

      • Gastec
      • 12 months ago

      Decade, as in a dozen years?

      • elites2012
      • 11 months ago

      mine neither!

    • chill
    • 12 months ago

    First of all, absolutely fantastic article.

    Single point of contention: your conclusions may be basically reversed for hyper-competitive FPS-style gamers who tend to play everything at native resolution (usually 1080p) with every setting on lowest or even lower (with some console-type tweaks to optimize gamma and make things as smooth/crisp/competition-focused as possible). I’m that guy! I suspect I might get the most value out of maxing out CPU/RAM and getting a mediocre GPU.

    Believe it or not, I could care less about graphics quality in all my gaming! I just want to get well over the 144-Hz refresh rate of my monitor so that I can maintain it as much as possible at all times and never be at a disadvantage in a gunfight. I’d be interested to see a few tests with competitive-type games (PUBG / CSGO / COD 4) and badly optimized ones that seem to crush CPUs (7 Days to Die) to see where the “break-even point” is for an overkill CPU + GPU, where they’re basically in balance, neither limiting the other too much.

    Of course I realize that test would be pretty-much just for me, so I’ll keep dreaming and in the meantime plan on picking up that 9900k with a custom-loop and every last hint of voltage forced into it for extra frequency 😛

      • ptsant
      • 12 months ago

      Although I don’t game the way you do, I think it’s worth pointing out that some people think that way. I have at least one friend who sets everything at low(est) to never go below 90 fps.

      As for myself, I’m happy to have a 9-12ms ping to most Battlefield servers and I suppose this compensates for at least 1-2 frames vs other players.

      • Shobai
      • 12 months ago

      [quote]Believe it or not, I could care less about graphics quality in all my gaming![/quote]

      I choose not to believe you, thanks. And [url=https://www.google.com/url?q=https://m.youtube.com/watch%3Fv%3D8Gv0H-vPoDc&sa=U&ved=2ahUKEwjOx4WKoaveAhXLqo8KHYkrAowQyCkwAHoECAoQAQ&usg=AOvVaw3cIIBQvPBMMd--Nxv0nWoT]Weird Al[/url] would like a word =P

      • cegras
      • 12 months ago

      I’m the same! BF1 and OW on low quality honestly looks good enough. I think at low enough details, it’s ‘free’ to bump up texture and AF as a concession.

      • K-L-Waster
      • 12 months ago

      Don’t the serious PUBG players turn off details for a competitive advantage?

      i.e. turn the details down far enough that bushes and trees don’t render == removing 25% or more of the “cover” on a given map?

        • Ifalna
        • 12 months ago

        Hah, I remember doing that in a Far Cry LAN session. Man, the others were confused… and then we all turned the settings down and laughed about how ugly it looked. 😀

      • Gastec
      • 12 months ago

      That looks more like a job than gaming. I get it that you “game” like that with two, three cans of energy drinks (per hour) on your left, but most of us just want to [i]relax[/i] playing games with a beer or a cup of coffee on our right (or say the mouse side), meaning we take the hand [b]OFF[/b] the mouse for a few seconds. I know, I know... blasphemy! 🙂

    • ptsant
    • 12 months ago

    Maybe I’m too cheap, but what should I get for a friend at $100? I can’t really believe that there is nothing better than the 2200G. What about the Intel Pentium Gold or similar stuff?

      • derFunkenstein
      • 12 months ago

      I’d much prefer a 2200G over a Pentium Gold G5600, for example, because two real cores is too few cores these days for just about anything even remotely intensive. Games are going to be limited enough by the four real cores in the 2200G, let alone the dual-core Pentium. The fact it doesn’t support AVX is another sticking point. It’s not absolutely necessary yet, but it will be for certain tasks during the useable lifetime of that system.

        • JustAnEngineer
        • 12 months ago

        Ryzen 3 2200G bundled with a B450 or B350 motherboard goes for less than $160, delivered. Core i3-8100 bundled with a Z370 or B360 motherboard costs 50% more. You can get a Ryzen 5 2600 and B450 motherboard for less than that.

          • derFunkenstein
          • 12 months ago

          For ptsant’s BF1 usage, I’d probably go with the 2600 + B450 combo anyway.

        • ptsant
        • 12 months ago

        I see. I noticed that Battlefield 1 is very CPU-intensive in 64-player mode online. I’m worried that the 2200G won’t be playable (at all!) in this setting. And I don’t even know what is going to happen with BF V, which is coming out in a couple of weeks.

        Also, what about the Ryzen 5 1400? (3.2GHz vs 3.5GHz but 8t vs 4t) If trivially overclocked to 3.6+ GHz it should be faster than the 2200G.

          • Firestarter
          • 12 months ago

          Multiplayer games with many players are often CPU-bound; you can see this in games like Arma 3 and Planetside 2 as well. The problem is that they’re CPU-bound in situations that are (almost) impossible to benchmark.

            • ptsant
            • 12 months ago

            I understand that it’s not practical, but it’s not unfeasible. Simply, it takes a long sampling period for a significant number of frame time samples to accumulate.

            One approach would be for 2 different players to play the same map on the same (full) server with different systems for at least 10 min (ideally, for a full match).

            Another approach would be for one player to gather frametime data over 1 week with at least 10h total playtime and then to change systems. It is assumed that 10h (or 20h?) playtime represent a decent mix of maps/servers/situations.

            I’m not going to claim that it is easy, but it would be definitely interesting to the millions of players who play BF, for example. One obvious comparison would be between a 4 core and a 6/8-core.

            • derFunkenstein
            • 12 months ago

            It would probably just take somebody with a 4-core/8-thread CPU checking out the CPU’s usage stats while playing a heavily-populated game. If all 8 threads are being maxed out, then my guess is that the Ryzen 5 1400 would be faster than the 2200G. Probably only a little bit faster, though, thanks to the clock speed deficit.

            • synthtel2
            • 12 months ago

            PS2’s CPU utilization keeps increasing as you throw more threads at it at least as far as 16T, but 8C16T (5.5T average utilization) isn’t really any faster than 4C8T (4T average utilization).

            • synthtel2
            • 12 months ago

            Were I trying to do this for Planetside, I’d probably have two players with different systems go stalker infil (can stay invisible indefinitely) and sit in some out-of-the-way corner at a big fight. With careful view alignment, the results should be pretty close.

            PS2 framerates are just too variable to get anything solid from a reasonable amount of normal play – a couple of times I’ve thought I fixed something major and then 5h of play later found a fight that bogged it back down almost as badly as before. There are also smaller-scale problems in that frustum culling matters a lot, so if for instance you’re attacking a biolab via a teleporter you may come up with a much slower average than the defenders of that biolab.

      • NovusBogus
      • 12 months ago

      None of the really cheap CPUs is a good deal due to the slowdown in obsolescence. A good 4-6 core today will be a decent to good 4-6 core in five years, but a bad 2-4 core will be an even worse 2-4 core. It’s not like the 2000s where you could buy last year’s hotness at a steep discount.

    • Ikepuska
    • 12 months ago

    Well, this article seems like a nice place to ask a question that’s been on my mind.

    I’ve been hoping to get ahold of a Dr. Zaber Sentry case for a Very SFF build. As a result I want to stay within a 65-W TDP. So I’ve been wondering: given that particular constraint, what would be the best perf/$ part I could get?

    I’ve been looking at the i5 8-series parts, which were, last time I checked, $220, $235, and $250 for the 8400, 8500, and 8600. Which would be the best deal there? I was thinking of using an H370 ITX board, or even a B360 if the price difference is enough.

    I also don’t know the AMD situation that well, so what chip and chipset combination would be good if I’m trying to stay within the 65-W boundary?

    An additional question is on the RAM. There’s tremendous variability in the prices there, but I want to know if it makes sense to allocate more money towards it.

      • sdch
      • 12 months ago
        • Ikepuska
        • 12 months ago

        Thanks for the reply.

        I’m looking at probably 1440 gaming or lighter 4k gaming in the living room. This will be paired with a GTX 1060 6GB to start and probably a 1070ti or 1080 once ebay prices for those come down a little more.

        Here’s a link to two sample builds I’m proposing. One of the biggest questions I’ve got is how much benefit there is to paying more for RAM. There are 2400 1x16GB kits you can buy for as low as $115 right now.

        The two proposed builds are:
        [url<]https://pcpartpicker.com/list/VVRnMZ[/url<] and [url<]https://pcpartpicker.com/list/4bxTV6[/url<] The Sentry is a VERY VERY small case, around 7L, with a maximum vertical clearance for coolers of just 48mm, so thermal capacity is pretty limited. ETA: I'm reusing the PSU from an older ITX build, but it was a Z97 board, so pretty much nothing else is reusable, but my overall budget is trying to stay under ~600 without the GPU. I'm just wondering if it makes more sense to go 8400, pay the extra 50 for 8600 or go to 2600. The selection of good MBs in the ITX size for Ryzen seems slightly more expensive than I expected so the total platform costs aren't as big a difference as they seemed at first, so some second opinions on that would be helpful.

          • sdch
          • 12 months ago
          • derFunkenstein
          • 12 months ago

          I’d go to the 2600 because the CPU cooler that comes with it should fit the case and be much better than the Intel solution. I wouldn’t want to use the stock Intel cooler in something that tight. CPU performance between the 2600 and the 8400 is pretty close, though.

            • sdch
            • 12 months ago
            • derFunkenstein
            • 12 months ago

            wow, guess it’s shorter than I thought, if the wraith stealth won’t fit.

        • jihadjoe
        • 12 months ago

        Desktop Ryzen has no integrated video though, so you’ll have to get a discrete GPU which will take space and add a bit more to your TDP.

        The 2400G might be a better fit for a very SFF build.

          • Ikepuska
          • 12 months ago

          As mentioned, I plan on a GTX 1060 or 1070/ti in the build. I’m using it for a living room gaming machine, not a general purpose HTPC. The case uses a riser to get the graphics card rotated 90 degrees, and can handle the card if I keep its temp down and/or use a blower style cooler.

          The TDP limit on the CPU is more because of the limited space for a CPU cooler, the GPU has a bit more to work with. Especially if I undervolt a little to save heat.

          [url]http://zaber.com.pl/sentry/[/url]

          This is the case I’m planning on going with.

      • Chrispy_
      • 12 months ago

      The Intel 8th-gen six-core parts are impressive performers, but if you can’t afford a 65-W i7-8700, you should probably get a Ryzen 5 2600 unless gaming is the only thing you care about.

      The Ryzen 5 2600 matches the clockspeed of the i5-8500 pretty closely; both chips are boosting at around 3.8 GHz when 6 cores are active. With cumulative patches for Spectre/Meltdown, the 10% IPC advantage that Intel used to have is all but gone. It’s a shame TR haven’t tested a vanilla 2600, but other sites have it running games [url=https://tpucdn.com/reviews/AMD/Ryzen_5_2600/images/perfrel_1920_1080.png]about 7% slower than the i5 at 1080p[/url], and it’s obviously GPU-limited as you increase resolution.

      When you want to do more than just gaming, the 2600 makes a lot of sense. 12 threads beat 6 for most productivity, and if you’re streaming or doing other things in the background whilst gaming, the 2600 is going to provide a smoother experience, especially now that newer games are starting to take advantage of higher core counts and showing performance separation between the 16T i9-9900K and 8T i7-9700K.

      If you go AMD, you’ll want to pair it with Samsung RAM for the best results, since Ryzen is very sensitive to RAM speeds. [url=https://www.reddit.com/r/Amd/comments/62vp2g/clearing_up_any_samsung_bdie_confusion_eg_on/]Here’s a thread[/url] with which kits you should be looking at. Don’t pay too much of a premium for the RAM, though; you’ll be making a mistake, because the i7-8700 with affordable DDR4-2666 is always going to be faster than a Ryzen 5.

        • Ikepuska
        • 12 months ago

        This is going to be a dedicated living room computer. Multi-tasking is not really going to be a thing. I’m going to be using a wireless trackball and keyboard, and controllers with it.

        It’s usually going to be used for games like FFXV, Dark Souls 3, and Shadows Die Twice when it comes out, and the like. I’m eventually going to pair it with a 4K TV with WCG and HDR, so a dedicated GPU is required, but the only other use case will be streaming Crunchyroll and Netflix, and that’s just not really taxing for the system.

        I’ve got a different productivity machine which is a little older in parts but still an excellent performer for what I do at home, it’s got an OCed 6700k in a Z270 board with 32 GB RAM, which serves my needs well for the time being.

        I’ve got a fair bit of give on the budget, ~$600 allocated to the MB, Chip and RAM. So it’s more the question of if stepping up to higher cost parts for each of those is worth it based on $/perf.

        In one of my replies up above I’ve got two sample builds sketched out just for reference. I’m just looking for more information on the locked parts and how they compare in gaming only.
        They just don’t get reviewed much, so I can’t see what the difference is between the 8400 and the 8600 in gaming, for example. And most reviews are of the 2600X.

        It’d be great if there was a ‘midrange CPU’ review that explicitly compared the various locked parts to each other, and with various memory, so that I could do a more proper cost benefit analysis.

          • sdch
          • 12 months ago
            • Ikepuska
            • 12 months ago

            I’ve got an old NCASE M1 build that had the motherboard die on me, and that started all of this. I’m going to reuse the NCASE for the moment, but I found out about the Sentry when looking into new parts for a SFF build, and my significant other loves it a great deal more than my M1, so that’s going to be that.

      • ronch
      • 12 months ago

      Posts like this should be in the forums, I think.

        • Ikepuska
        • 12 months ago

        I figured originally I’d just get a response or two with at most one or two lines, and this article seemed a relevant place for the idea initially. Then it expanded beyond my initial intent. If there were a way to relocate it to the forums I’d definitely do that. Oh well, live and learn.

    • Arbiter Odie
    • 12 months ago

    EDIT: Wrong article, oops!

    • rahulahl
    • 12 months ago

    Just in case you are looking for more CPU-bound games, Warhammer: End Times – Vermintide is an excellent example. Not sure how easy it is to benchmark, though.

      • Ninjitsu
      • 12 months ago

      can i add Arma 3

      • Ifalna
      • 12 months ago

      Just grab World of Warcraft or Final Fantasy XIV in a populated city. Doesn’t get more CPU bound than that.

    • synthtel2
    • 12 months ago

    Most people seem to overcomplicate CPU/GPU balance, resolution, and so on. I mainly use two numbers for these decisions: the target framerate, and the framerate you think you’ll actually be able to get GPU-side. Neither is tough to calculate. Take the lower of the two and you’ve got a single number which pretty well represents how much CPU power you’re looking for.

    I think people don’t like that answer because it says you don’t need much CPU to drive a 60 Hz monitor (or a 4K rig that’s only getting 60 GPU-side) and getting to 144 consistently is impossible. That’s all realistic, though.

    [quote]We have to dig deep into our Steam libraries for titles that care much about the CPU they run on when we’re getting set up for CPU reviews.[/quote]

    If 60 fps is your target, sure, but there’s no shortage of games that won’t do a solid 144 no matter what CPU you throw at them.

    [quote]At the opposite extreme, do you play esports titles at the lowest possible resolution and highest possible frame rates? You’ll want to pick a CPU with fewer cores and threads and higher per-core clock speeds—possibly one that’s unlocked for easy overclocking.[/quote]

    Don’t go for a 2700X, of course, but I’m a little more wary of 6C6T than you seem to be when CPU framerates are critical. 6C12T or 8C8T aren’t worth big boosts, but when a 5.3 GHz 9900K with the fastest RAM money can buy still wouldn’t quite do what you want, any boost you can get without compromising stability or longevity is a big deal.

    [quote]Shortages of Intel’s 14-nm processors have recently driven up the i5-8400’s price at retail, but we still think it’s the best midrange gaming processor around. If you can’t find stock of the i5-8400 at a reasonable price, though, read on.[/quote]

    This seems like a small note to counter the 8400’s high placement in the article overall. Realistic prices right now look to be $250-280 when it’s in stock at all, and that’s a raw deal when 2600Xs can be had for $230 or less and 9600Ks are within spitting distance.

      • auxy
      • 12 months ago

      [quote]Most people seem to overcomplicate CPU/GPU balance, resolution, and so on. I mainly use two numbers for these decisions: the target framerate, and the framerate you think you’ll actually be able to get GPU-side. Neither is tough to calculate. Take the lower of the two and you’ve got a single number which pretty well represents how much CPU power you’re looking for.[/quote]

      Well I think you’re grossly oversimplifying it! Particularly curious considering you’re making this post on the very website that invented sub-frame performance measurement. ('ω')

      I dunno, maybe you’re not a big action gamer, or maybe you’re not acquainted with playing on a low-latency display. The difference between an i7-8700K and a Ryzen 7 1700X, both equipped with the same RX 580 8GB card, both running with fast DDR4-3200 memory, outputting to a 4K monitor (surely the GPU is the hard limiter here?) is still perceptible to me. That’s in GTA Online and in Warframe. The difference is even bigger with a 240Hz display. Keep in mind you can use DSR to run 4K at high refresh rates on a lower-resolution display. I do this a lot on my 1080 Ti.

      Sure, not all games will behave that way and definitely there are plenty of games that won’t care about the slower CPU, but saying some crap like “just worry about what your GPU can do” is garbage. Bottlenecks don’t exist in essence over an entire system, an entire game, or even a per-session basis. Bottlenecks and performance gaps happen on a sub-frame basis just like any other performance consideration, and having a slow CPU can prevent the rest of your hardware from reaching its potential (and thus hamper your experience) even if the GPU is the primary limiting factor.

        • synthtel2
        • 12 months ago

        I did skim over some things, but I’ll stand by the concept. The core of it is that CPU and GPU performance interact with each other a lot less than people think they do. If one of them is meeting or failing to meet your performance targets, that isn’t going to change if the other changes. The real tricks are figuring out what your performance targets are and what’s actually impacting your ability to reach them, but those are both much simpler problems than they look the way most people present it.

        There are a variety of types of performance targets, of course, but most people even here are most comfortable thinking about average framerate, and it’s a pretty good proxy for the rest now. Frame delivery is usually far smoother than when TR started testing this stuff, and latency also correlates pretty strongly with framerate. Latency is complex, but accounting for frametime variance if your standards for it are particularly strict usually just means moving the CPU framerate target upwards a bit.

        I use a 144 Hz monitor, the game I play most is twitchy enough that 7ms of extra latency makes me noticeably worse at it, and I often find games that nobody else seems to have a problem with unplayable for latency reasons. Uneven input sampling also bugs me, but I’ll grant I’m not the most sensitive to uneven frame delivery at the display. I am usually CPU-bound with a 1700, 480, and 1440p, so your situation doesn’t sound that weird to me, nor does it sound like the GPU should necessarily be any kind of hard limiter. Since you didn’t mention what kind of framerates you’re actually getting out of it though, I have no way to judge.

        With a 60 Hz monitor and most games, I do feel pretty confident saying that most people won’t notice a difference between a 2600X and 9900K. That’s *very* different from “just worry about what your GPU can do”. I was going for something more like “if your goal is 16.7 ms 99.9% frametimes and your CPU is already capable of that in the games you play, don’t expect a CPU upgrade to take your GPU minimums from 58 to 62.”

        [quote][...] and having a slow CPU can prevent the rest of your hardware from reaching its potential (and thus hamper your experience) even if the GPU is the primary limiting factor.[/quote]

        Preventing the hardware from reaching its full potential isn’t the same as hampering your experience, though. No matter what you do, there will always be some amount of potential performance left on the table in every frame. "Is this upgrade a good match for the rest of my components?" is a much less important question than "will this upgrade make my gaming experience better?"

        It sounds like your top priority is consistent frametimes including in multiplayer games, so it makes sense that you’d want something CPU-heavy and GPU-light. My top priority is latency reduction and I’m often stuck with a flip queue of 3, so I like the inverse. Both of us are leaving a lot of theoretical performance on the table, but that doesn’t mean they aren’t the right builds for us.

    • Kretschmer
    • 12 months ago

    Has anyone found a set of good i3-8100 vs 2200G benchmarks? It seems odd to pit an APU vs CPUs for raw CPU performance.

    And yes, $30 more is 30% more for the CPU, but it’s unlikely to be a large proportion of the overall build.

      • synthtel2
      • 12 months ago

      The 8100 is an APU too, Intel just doesn’t call theirs that. 😉

        • derFunkenstein
        • 12 months ago

        Yep, as long as there’s an integrated GPU. 🙂

    • Shobai
    • 12 months ago

    Can I recommend you add a price to the i5 8400’s section? It is the only one that doesn’t currently have a price.

    • Kretschmer
    • 12 months ago

    I’m amused that six physical cores are “don’t intend to do much heavy lifting with your PC.” The world is going to hell, but at least CPU cores have proliferated. Thanks, Obama!

    • Ninjitsu
    • 12 months ago

    Much needed article, thanks a lot!

    • Firestarter
    • 12 months ago

    I’m very happy with my 9700K so far; mine really likes to clock as well, but when loading up 8 cores at 5.2 GHz the power consumption is insane. Intel, AMD, and Nvidia really did well with their adaptive voltage/clock strategies IMO; the extra performance we can get by OCing is nice, but it comes at a cost. The built-in voltage curves are already pretty close to optimal as far as I can tell.

      • Krogoth
      • 12 months ago

      That’s pretty much why CPU overclocking is dead and only around for bragging rights. High-end SKUs are already near the voltage/power limit and clockspeed ceiling of the silicon at stock.

        • Firestarter
        • 12 months ago

        agree, most people would be fine with MCE and upping the power limits a bit; manually overclocking doesn’t get much more performance these days

    • Ifalna
    • 12 months ago

    Heh, and here I am still sitting next to my 3570K and feel no need to upgrade.

    I mean, I would but having to buy a new board and RAM in addition to a CPU makes it not worth it to me.

      • Gastec
      • 12 months ago

      I’m in the same boat. But I can already see it: when I decide to upgrade, the prices will be double what they are now.

      • LoneWolf15
      • 9 months ago

      Same for my i7-4790K. And my 32GB of DDR3 came cheap.

    • HERETIC
    • 12 months ago

    With the red-headed stepchild (8600K) sitting within 1 to 2% of the best at 1440p, I was looking forward to the 9600K. In AT’s review, it seemed to stick like glue to the 8600K, with hardly any benefit from the extra clock speed. Maybe the Meltdown fix in hardware comes at a price...

    • JustAnEngineer
    • 12 months ago

    You know…

    It has been five months since TR published a system guide.

      • Wonders
      • 12 months ago

      This article is a cool format, which I was surprised to find more enjoyable than the corresponding segment of a system guide. Not as practical/functional, but more fun to bite into.

        • JustAnEngineer
        • 12 months ago

        System guides have been written with this approach. A number of styles have appeared in the system guide’s thirteen-year history.
        [url]https://techreport.com/forums/viewtopic.php?f=33&t=47062[/url]

    • drfish
    • 12 months ago

    I warned Jeff before he posted this: simple explanations and alternating recommendations between AMD and Intel at logical performance thresholds would [i]not[/i] sit well with either camp’s shills. Publicly endorsing both companies’ strengths is a grave mistake, mark my words.

      • derFunkenstein
      • 12 months ago

      Quite a toothteller. Soothsayer. Whatever.

    • Chrispy_
    • 12 months ago

    Roger.

    2600X or 8400 and spend the change on a better GPU.

    Also +1 for good article.

      • Gastec
      • 12 months ago

      On Amazon:
      Core i5-8400, 6 cores/6 threads, [i]2.80 to 4.00 GHz[/i]: $200
      Core i5-8600, 6 cores/6 threads, 3.10 to 4.30 GHz: $234
      Core i5-8600K, 6 cores/6 threads, 3.60 to 4.30 GHz: $260
      Ryzen 5 2600X, 6 cores/12 threads, 3.60 to 4.20 GHz: $210
      Core i5-9600K, 6 cores/6 threads, 3.70 to 4.60 GHz: $280
      Core i7-9700K, 8 cores/8 threads, 3.60 to 4.90 GHz: [b]$410[/b] (out of stock) 🙂

    • blargh4
    • 12 months ago

    “Past that, we expect the CPU landscape to remain stable through the end of the year”

    I feel it’s also worth noting in the summary that the 7-nm Zen 2 CPUs should be announced early next year and will bring both major process and microarchitectural improvements. Even if they don’t become the new gaming CPUs of choice, the market may shift quite drastically within a couple of months, so it may be worth waiting if you’re not in a rush to upgrade.

      • Krogoth
      • 12 months ago

      AMD is going to be focusing its 7-nm Zen 2 parts on SMB/enterprise customers first before making distilled versions for the regular desktop/laptop markets.

      I expect Zen 2 desktop SKUs to at least hit 4.0 GHz base and turbo up to ~4.5-4.6 GHz, plus some microarchitecture improvements. They should be at least on par with Coffee Lake, or close to it, in gaming and lightly-threaded applications.

        • Kretschmer
        • 12 months ago

        Wait, AMD makes laptop chips?

          • Krogoth
          • 12 months ago

          Yes, with their recent Ryzen Mobiles and the older Fusion FX APUs that were commonplace in the budget side of the portable market.

    • Krogoth
    • 12 months ago

    8600K and 8700K are honorable mentions if you can’t find 9700K and 9600K in stock.

    The 8400 is okay, but its turbo speed will end up croaking it once you scale beyond four threads. The clockspeed advantage it has over the 2600 evaporates.

      • Jeff Kampman
      • 12 months ago

      There’s clearly more to the i5-8400’s advantages in games, though; else the 2600X would clobber it 100% of the time.

        • Krogoth
        • 12 months ago

        That’s only for games of today and stuff from the past. In the future, it is quite possible that some game developers will begin to move beyond four threads.

        The 8400 does indeed start to fall behind and at best only matches the 2600/2600X when it is pushed beyond four threads (seen with production suites) because of its limited turbo speed.

        The 8400 is a compromise. It makes sense if you need strong CPU gaming performance now on a budget but aren’t too concerned about CPU gaming performance in future titles. At that point, you’ll probably upgrade to a newer CPU and platform.

          • Jeff Kampman
          • 12 months ago

          OK, but does this hypothetical future look more like [i]Doom[/i] or does it look more like [i]Crysis 3[/i]?

          If the future is more [i]Doom[/i]-esque, then it seems like anything with a clock signal will be able to run games competently with any graphics card. If the future was in [i]Crysis 3[/i]-esque titles, then API advances wouldn’t seem to matter. We’d already be consistently exploiting parallelism on the CPU for PC games.

          Even a couple years into the era of DirectX 12, I haven’t seen a single recent game that uses every core and thread of the CPU to consistently deliver higher performance (looking at you, [i]Assassin’s Creed[/i]); if anything, [i]Gears of War 4[/i] suggests it’s just as easy not to write parallel code in next-gen APIs as it is with DirectX 11. People should buy what delivers the best performance today, all else being equal.

          • Kretschmer
          • 12 months ago

            We’ve been hearing that games will crave MOAR COARS since Bulldozer launched. It’s very unlikely that games of the future will be more lenient on per-core performance AND want more than six hardware threads. I would happily bet that six fast cores will beat eight slow cores in 2020’s games.

            • Krogoth
            • 12 months ago

            It is already happening, though. Quad-core CPUs without HT are already beginning to struggle with some current games (seen as hiccups in frame times) when there aren’t enough free cores for the OS’s scheduler to dance around with other background processes.

            It’ll only get worse as mainstream six-core CPUs and greater SKUs become more and more commonplace. It is history repeating itself. We saw the same thing happen to dual-core CPUs back when quad-core CPUs started becoming commonplace.

            • Gastec
            • 12 months ago

            That’s the future alright, 4 cores for the game and 4 cores for Windows’ bloatware and adware.

        • derFunkenstein
        • 12 months ago

        Yes, he forgot that Intel’s cores are faster per-clock than anything AMD has released (still). It’s like you’ve been preaching the last year and a half: having more of them makes up for it in some workloads and not others, so pick the CPU that works for your needs.

          • Krogoth
          • 12 months ago

            In the 8400’s case, the clockspeed advantage evaporates when it is forced to use more than four cores due to how its turbo speed is set up. It was intentionally gimped in that manner so it wouldn’t end up cannibalizing the 8600K and 8700K.

            • Mr Bill
            • 12 months ago

            Thus, “Win one for the Gimper”.

            • derFunkenstein
            • 12 months ago

            “Evaporates” = “max turbo drops 200MHz”

            [url]https://www.tomshardware.com/reviews/intel-coffee-lake-core-i5-8400-cpu,5281.html[/url]

            What you didn’t read correctly is that I said that Intel’s cores are faster PER CLOCK. Intel is giving up 5% six-core clock compared to AMD’s 6-core clock of 4 GHz on the 2600X. That’s nothing compared to just how much faster per-clock Intel’s Skylake++ core tends to be.

      • Voldenuit
      • 12 months ago

        Does Multi-Core Enhancement work on locked Intel CPUs, or can you not tinker with the multiplier at all?

        • Krogoth
        • 12 months ago

        I believe that MCE only works with “K” series chips so 8400 users would be SOL.
