i7-7700K vs. i7-2600K
Surely you've already read our i7-7700K review, but if not, it would be a good idea to catch up before reading further. The details below aren't anywhere near as comprehensive as our standard suite of testing. I'll be talking about the results of just one benchmark and my own personal pair of chips. I'm going for real-world results, as one might compare their own two builds post-upgrade. We're talking about completely different systems and even slightly different driver versions here. The only things in common are the video card, the operating system (64-bit Windows 10, of course), and the benchmark itself.
Before I get all excited about the actual benchmark details, I'm going to bust out a dirty ten-letter word: subjective. That's not generally how we roll at TR, so I'll avoid claiming anything definitively without the data to back it up. That said, the "seems faster" factor is real, even though I wasn't expecting it to be noticeable. It's there, even just in normal day-to-day use.
I'm not alone in noticing this, either. Our resident code monkey, Bruno Ferreira, jumped from an i5-2500K to an i7-6700K last year and spammed the TR Slack channel about how awesome it was. That happened even without the hot-clocked memory and NVMe SSD in my new build. Perhaps the most surprising boost is how the lowly gem RimWorld benefits from my new hardware. My colonists can now raise substantially larger emu armies before frame rates suffer. Excellent.
Bohemia Interactive's DayZ is where I spend most of my gaming time, however, and its performance can be approximated well with Yet Another Arma Benchmark, an AI-heavy scenario for Arma III made by community member Greenfist. This is the same benchmark that Jeff used to tease out the performance differences between various memory speeds in his i7-7700K review. That brings us back to my feelings about Ryzen. I suspect Kaby Lake will have a significant leg up on AMD's latest in the games where I'm looking for better performance. Let's see what my upgrade bought me in Arma III and DayZ.
For this test, all of Arma's graphics settings were cranked up to the max, except for FSAA. My i7-2600K was running at 4.2 GHz and my i7-7700K was running at 4.8 GHz, which is as far as I've pushed it so far. (Side note: I think I got a pretty good Kaby chip, and I may have won the TIM lottery, as well.) I'm using 16GB of DDR3-2133 with my i7-2600K and 16GB of DDR4-3733 with my i7-7700K. My GTX 980 Ti is the same in both cases but the SSDs, motherboards, PSUs, cases, and cooling hardware are all different. However, I don't think that any of those are major contributors to the performance delta between the systems. On to the results.
Ouch. This notoriously poorly-performing title from 2013 puts the hurt on my old system. Even at 1920x1080, it doesn't get over 45 FPS and averages just over 30 FPS. Jeff's unofficial testing for the i7-7700K review placed a stock i7-3770K (3.5 GHz) with DDR3-1866 and similar settings at about 35 FPS when paired with a GTX 1080, so we at least know we're in the ballpark of what to expect. Let's see what six years of ticking and tocking gets us.
Yowza! Nearly double the average FPS from "just" a CPU upgrade? That's not normal. Only a game engine as goofy as Arma III's could produce that result. I'll take it, though. This jump in performance is a huge quality of life improvement and I can confirm that, subjectively, it carries over to DayZ, as well. What's that, you say? None of that matters because 1920x1080 is for chumps and if I was playing at a proper resolution the CPU wouldn't matter nearly as much? Well, let's check out one more result, then: this time at 3440x1440.
You're looking at just a 2.5-FPS drop after a roughly 2.4x increase in the number of pixels being pushed around. There's a saying for this sort of thing: That's Arma! I guess my GTX 980 Ti isn't out of a job yet. In DayZ, the results have always been similar. In the game's worst-performing areas, it didn't matter whether you ran at 1080p with everything on low or at 4K with AA on high: your frame rates were going to be in the low teens no matter what, and your GPU wouldn't even break a sweat.
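For the curious, the pixel math behind that resolution comparison is easy to double-check. Here's a quick sketch (my own back-of-the-envelope arithmetic, not part of TR's test suite):

```python
# How many more pixels does 3440x1440 push than 1920x1080?
ultrawide = 3440 * 1440   # 4,953,600 pixels
full_hd = 1920 * 1080     # 2,073,600 pixels

ratio = ultrawide / full_hd
print(f"{ratio:.2f}x")    # prints "2.39x", i.e. roughly 2.4x the pixels
```

In other words, the GPU is shading nearly two and a half times as many pixels for a frame-rate cost of only a couple FPS, which is about as clear a sign of a CPU-bound engine as you'll find.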
These results are relevant to my upgrade story because as recently as early last summer, before the DX11 renderer and version 0.60 of the game were released, my trusty i7-2600K and GTX 980 Ti combo was nigh-unplayable in many areas of the map. The 0.60 update would ultimately double or triple the frame rate in the most troublesome areas, and my new PC is frequently doubling those numbers yet again. All with the same graphics card, and all while Bohemia Interactive is improving the look of the game with dynamic shadows and lighting. Cumulatively, it's completely transformative. I didn't know if I would ever see the day when DayZ ran buttery smooth. Such a thing was practically unachievable just eight months ago, no matter what hardware you had or how much money you spent.
Make no mistake: I understand that my experience is clearly a niche case. Most games aren't as CPU-limited as DayZ and Arma III. However, the lesson I learned after chasing perfect DayZ performance for years is that sometimes we just need CPUs to get things done now. In those cases, there's no substitute for clock speed, IPC, and, apparently, even memory bandwidth and latency. I was guilty of underestimating the importance of the CPU in the overall system, especially since I've always gamed at above-average resolutions where conventional wisdom suggests the CPU matters even less.
All told, I stretched my Sandy Bridge system too far. I could have experienced most of the improvements I've seen in the last month earlier if I had jumped on Skylake when it came out. Then again, Z270 seems to be better-suited to super-fast memory than Z170, so maybe waiting was the right choice. What about Ryzen? Well, we'll just have to see how that pans out. I don't think I'll regret my choice, but I geared my upgrade toward a specific task. You win some benchmarks, you lose some. Other builds will be better suited for other needs. All told, though, I'm happy with my decision, and if you're wondering whether it's time to upgrade your own Sandy system, the answer is probably "yes."