Personal computing discussed

Moderator: Hoser

 
DeadOfKnight
Gold subscriber
Gerbil Elite
Topic Author
Posts: 718
Joined: Tue May 18, 2010 1:20 pm

CPU vs GPU Bottlenecks

Sun Dec 09, 2018 7:55 pm

Over the many years I've been upgrading my PC for smoother gaming performance, I've heard it suggested many times, here and elsewhere, to get a mid-range CPU and the fastest GPU you can afford. While this has often held up in terms of raw fps numbers, I find that GPU-bound games run relatively smoothly even below a 60 fps average. CPU-bound games, however, seem less smooth even above a 60 fps average. This trend has only become more apparent as games have started utilizing 8+ threads. It's easy to observe in benchmarks that seem to love the new AMD chips even though they're clocked roughly 20% lower.

My point is that it really seems to me that the #1 priority should be to minimize the CPU bottleneck, and then get the fastest GPU you can afford with the rest. CPU bottlenecks are just more nightmarish to try to game with than GPU ones. Average frame rates notwithstanding, you get more drops, stutters, and laggy feeling game-play in CPU-bound games. Plus it's just that much easier and more tempting to upgrade the GPU sooner than the CPU. So, when recommending a build for someone here in the forums, I think it's good service to recommend one of the best CPUs for gaming, even if that means GPU budget will be lower. What do you all think?
Last edited by DeadOfKnight on Sun Dec 09, 2018 9:17 pm, edited 1 time in total.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
synthtel2
Gold subscriber
Gerbil Elite
Posts: 866
Joined: Mon Nov 16, 2015 10:30 am

Re: CPU vs GPU Bottlenecks

Sun Dec 09, 2018 8:30 pm

Being CPU-bound gets you less consistent frametimes, as you're seeing, but being GPU-bound often increases latency. There's a lot of personal preference in it. Getting both consistent frame delivery and ideal latency for a given game often takes a framerate limiter, but a lot of gamers recoil at the thought of using one.

I can tolerate a lot of sloppiness in frametimes, but very little in latency, so I prefer to be CPU-bound. That seems to put me in a minority though.
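A framerate limiter delivers frames on a fixed cadence instead of as fast as the hardware allows, which is how it evens out frame delivery. A minimal sketch of the idea, assuming a toy render loop (the helper names and the 5 ms simulated frame are made up for illustration):

```python
import time

def run_capped(render_frame, cap_hz=60.0, frames=5):
    """Toy render loop with a frame limiter: sleep off the slack so frames
    are delivered on a fixed period instead of as fast as possible."""
    period = 1.0 / cap_hz
    next_deadline = time.perf_counter() + period
    frametimes = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                      # simulated CPU+GPU work
        slack = next_deadline - time.perf_counter()
        if slack > 0:
            time.sleep(slack)               # wait out the rest of the frame
        frametimes.append(time.perf_counter() - start)
        next_deadline += period             # fixed cadence -> even pacing
    return frametimes

# ~5 ms of simulated frame work, capped to 60 Hz -> ~16.7 ms frametimes
times = run_capped(lambda: time.sleep(0.005), cap_hz=60.0, frames=5)
```

The trade synthtel2 describes falls out of the sleep: the frame you see is a bit staler than it could be (latency), but every frame lands on the same schedule (consistency).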
 
DeadOfKnight
Gold subscriber
Gerbil Elite
Topic Author
Posts: 718
Joined: Tue May 18, 2010 1:20 pm

Re: CPU vs GPU Bottlenecks

Sun Dec 09, 2018 9:22 pm

synthtel2 wrote:
Being CPU-bound gets you less consistent frametimes, as you're seeing, but being GPU-bound often increases latency.

Being GPU-bound also means that tweaking your settings will have a far greater effect on performance. If a game is CPU-bound, you'll often find that turning settings down from ultra yields little to no benefit.

I can tolerate lower performance for extra eye candy and vice versa, but I have little tolerance for no control. The ability to tweak and control the experience is what PC gaming is all about IMO.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
Kretschmer
Silver subscriber
Gerbil XP
Posts: 458
Joined: Sun Oct 19, 2008 10:36 am

Re: CPU vs GPU Bottlenecks

Sun Dec 09, 2018 9:33 pm

DeadOfKnight wrote:
This trend has only become more apparent as games have started utilizing 8+ threads. This can easily be observed in benchmarks that seem to love the new AMD chips even though they are 20% slower in clock speed.

Which benchmarks are these? As far as I can see, my 7700K with DDR4 3200 is faster than any AMD CPU yet released in most games. TR even did a whole benchmark on the subject: https://techreport.com/review/33568/gam ... zen-cpus/4

These days, I would probably choose to be GPU bound (within reason) than CPU bound. It's better to turn down some settings and upgrade your GPU in a few years than have games be permanently slow until you rip the beating heart of your computer out in a complex procedure.

That said, an i5 8400 had a 99th percentile FPS of 80 in TR's most recent benchmark, so it's not too tough to avoid being CPU bound.
 
synthtel2
Gold subscriber
Gerbil Elite
Posts: 866
Joined: Mon Nov 16, 2015 10:30 am

Re: CPU vs GPU Bottlenecks

Sun Dec 09, 2018 10:01 pm

I'd agree on control, but in practice I almost always just turn down settings until I'm CPU-bound (or all the way to potato in the case of twitchy FPSes because it still helps latency even if framerates don't improve). My target framerate is always whatever the CPU can manage, and if graphics have to be ugly sometimes to make that happen, oh well.

Arguably I should just get a faster CPU, but I don't think any CPU I could buy would be fast enough to let me handle this differently, and since my current CPU is more than fast enough for everything else I do with it an upgrade would be a lot of $$$ for not much gain.
 
DeadOfKnight
Gold subscriber
Gerbil Elite
Topic Author
Posts: 718
Joined: Tue May 18, 2010 1:20 pm

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 1:17 am

Kretschmer wrote:
Which benchmarks are these? As far as I can see, my 7700K with DDR4 3200 is faster than any AMD CPU yet released in most games.

I don't mean that AMD's chips fare better than Intel's for any kind of gaming, only that some games do like having 8 physical cores over 8 logical cores at much higher clock speeds, so AMD's chips aren't left in the dust in those scenarios. They can compete on value for dollar. This suggests that 8 physical cores could become the new baseline for PC gaming enthusiasts as more and more games like this come out. It was only a matter of time, since the consoles use 8-core AMD Jaguar CPUs.

Speaking of the consoles, I would expect the next generation of gaming consoles to usher in a new CPU-bound revolution. There's a lot that the developers can't do in their games because they have to be able to run on current generation consoles.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
End User
Gold subscriber
Minister of Gerbil Affairs
Posts: 2966
Joined: Fri Apr 16, 2004 6:47 pm
Location: Upper Canada

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 2:29 am

DeadOfKnight wrote:
My point is that it really seems to me that the #1 priority should be to minimize the CPU bottleneck, and then get the fastest GPU you can afford with the rest.

And you believe the i7-5775c is the best way to "minimize the CPU bottleneck" for a 2080 Ti FE driving a 3440x1440 120Hz G-SYNC display?
 
Topinio
Gerbil Jedi
Posts: 1641
Joined: Mon Jan 12, 2015 9:28 am
Location: London

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 7:30 am

Mostly agree with OP for building new, but then I personally prefer to build new (motherboard, CPU, RAM; maybe PSU and chassis) using a re-deployed GPU from the last build and to upgrade the GPU later.

For gaming, my flow is to put the settings on as high as I think might be possible, and detune from there -- but if that's not quickly possible for any reason, I turn settings all the way to potato and work up from there until the experience isn't good, then back a notch. Ultimately, it's probably the CPU that's going to be the weak spot on my machines, most of the time.
Desktop: E3-1270 v5, X11SAT-F, 32GB, RX Vega 56, 500GB Crucial P1, 2TB Ultrastar, Xonar DGX, XL2730Z + G2420HDB
HTPC: i5-2500K, DH67GD, 6GB, RX 580, 250GB MX500, 1.5TB Barracuda
Laptop: MacBook6,1
 
The Egg
Minister of Gerbil Affairs
Posts: 2466
Joined: Sun Apr 06, 2008 4:46 pm

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 8:50 am

I thought I remembered reading that just as with the system RAM, the consoles don't necessarily have full access to all 8 cores for gaming. In any case, I think it'd be pretty neat to see TR do a gaming rundown of 4/4, 6/6, and 8/8 Skylake-based CPUs (since we have all those now), all set to the same clockspeed with turbo disabled and a 2080 Ti. Should give a fairly straightforward answer as to the current effect of core-count in games.
 
End User
Gold subscriber
Minister of Gerbil Affairs
Posts: 2966
Joined: Fri Apr 16, 2004 6:47 pm
Location: Upper Canada

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 12:57 pm

DeadOfKnight wrote:
It was only a matter of time, since the consoles use 8-core AMD Jaguar CPUs.

That is a super old/weak architecture now. PC gaming took multithreaded gaming and ran with it ages ago.

DeadOfKnight wrote:
I would expect the next generation of gaming consoles to usher in a new CPU-bound revolution.

That sounds like a terribly ill-conceived product. Are you visualizing crappy CPUs paired with crazy bonkers GPUs?
 
synthtel2
Gold subscriber
Gerbil Elite
Posts: 866
Joined: Mon Nov 16, 2015 10:30 am

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 4:25 pm

End User wrote:
DeadOfKnight wrote:
I would expect the next generation of gaming consoles to usher in a new CPU-bound revolution.

That sounds like a terribly ill-conceived product. Are you visualizing crappy CPUs paired with crazy bonkers GPUs?

The new consoles will probably be 8C Zen2, meaning game devs will feel free to use a lot more CPU power, meaning PC gaming will get more CPU-bound.
 
DancinJack
Maximum Gerbil
Posts: 4262
Joined: Sat Nov 25, 2006 3:21 pm
Location: Kansas

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 4:45 pm

synthtel2 wrote:
The new consoles will probably be 8C Zen2, meaning game devs will feel free to use a lot more CPU power, meaning PC gaming will get more CPU-bound.

What, at 850MHz?
i7 6700K - Z170 - 16GiB DDR4 - GTX 1080 - 512GB SSD - 256GB SSD - 500GB SSD - 3TB HDD- 27" IPS G-sync - Win10 Pro x64 - Ubuntu/Mint x64 :: 2015 13" rMBP Sierra :: Canon EOS 80D/Sony RX100
 
Krogoth
Gold subscriber
Gerbil Elder
Posts: 5594
Joined: Tue Apr 15, 2003 3:20 pm
Location: somewhere on Core Prime
Contact:

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 4:53 pm

synthtel2 wrote:
End User wrote:
DeadOfKnight wrote:
I would expect the next generation of gaming consoles to usher in a new CPU-bound revolution.

That sounds like a terribly ill-conceived product. Are you visualizing crappy CPUs paired with crazy bonkers GPUs?

The new consoles will probably be 8C Zen2, meaning game devs will feel free to use a lot more CPU power, meaning PC gaming will get more CPU-bound.


They will most likely be 4C Zen2 chiplets, soaking up the bulk of the sub-par yields. 6C/8C Zen2 chiplets will land in regular Zen2-based Ryzen, Threadripper, or Epyc territory depending on power consumption/clockspeed. Obviously, Zen2-based Epycs will get the "best" chiplets.
Gigabyte Z390 AORUS-PRO Coffee Lake R 9700K, 2x8GiB of G.Skill DDR4-3600, Sapphire RX Vega 64, Corsair CX-750M V2 and Fractal Define R4 (W)
Ivy Bridge 3570K, 2x4GiB of G.Skill RIPSAW DDR3-1600, Gigabyte Z77X-UD3H, Corsair CX-750M V2, and PC-7B
 
dragontamer5788
Gerbil Team Leader
Posts: 200
Joined: Mon May 06, 2013 8:39 am

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 4:54 pm

DancinJack wrote:
synthtel2 wrote:
The new consoles will probably be 8C Zen2, meaning game devs will feel free to use a lot more CPU power, meaning PC gaming will get more CPU-bound.

What, at 850MHz?


I think his point is that game devs will start to use the CPU (maybe more animations, for example) in standard AAA games, which will cause more games of the future to rely on CPU when they're ported to the PC. Currently, there are 8x Jaguar cores on the PS4 / Xbox, which are so slow that most video games today have a variety of tricks to drop CPU power down significantly.

That is: Synthtel2 seems to think that MS Windows games will be more CPU bound, as CPUs on consoles get stronger.
 
DancinJack
Maximum Gerbil
Posts: 4262
Joined: Sat Nov 25, 2006 3:21 pm
Location: Kansas

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 5:07 pm

dragontamer5788 wrote:
I think his point is that game devs will start to use the CPU (maybe more animations, for example) in standard AAA games, which will cause more games of the future to rely on CPU when they're ported to the PC. Currently, there are 8x Jaguar cores on the PS4 / Xbox, which are so slow that most video games today have a variety of tricks to drop CPU power down significantly.

That is: Synthtel2 seems to think that MS Windows games will be more CPU bound, as CPUs on consoles get stronger.

Aye, I understood. I just think Synthtel may have been overly optimistic about what kind of CPU they can fit into the required TDP range for a next-gen console, which in turn drives the performance impact. And so on.

https://www.anandtech.com/show/11992/th ... x-review/6

The Xbox One X, which features 8 Jaguar (yeah, that old) cores @ about 2.3GHz at load, draws as much as ~170W, but generally a lot lower than that. I'd love for them to fit full-fat 8 core Zen/whatever cores into a next gen console and actually move them closer to PCs, but I just don't see that happening quite yet.

edit: just so I don't get dunked on, I don't think they're going to go full-fat Zen2 or whatever you might call it. Everything in the consoles is surely going to be semi-custom ("based on" Zen2 and Vega/whatever is next), but they're going to have to cut them down quite a bit to fit a decent TDP. That's all I'm getting at. It will still definitely be better than friggin' Jaguar cores.
i7 6700K - Z170 - 16GiB DDR4 - GTX 1080 - 512GB SSD - 256GB SSD - 500GB SSD - 3TB HDD- 27" IPS G-sync - Win10 Pro x64 - Ubuntu/Mint x64 :: 2015 13" rMBP Sierra :: Canon EOS 80D/Sony RX100
 
dragontamer5788
Gerbil Team Leader
Posts: 200
Joined: Mon May 06, 2013 8:39 am

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 5:16 pm

DancinJack wrote:
dragontamer5788 wrote:
I think his point is that game devs will start to use the CPU (maybe more animations, for example) in standard AAA games, which will cause more games of the future to rely on CPU when they're ported to the PC. Currently, there are 8x Jaguar cores on the PS4 / Xbox, which are so slow that most video games today have a variety of tricks to drop CPU power down significantly.

That is: Synthtel2 seems to think that MS Windows games will be more CPU bound, as CPUs on consoles get stronger.

Aye, I understood, I just think Synthtel may have been overly optimistic about what kind of CPU they can fit into required TDP range for a next gen console. Which also leads to the performance impact. And so on.

https://www.anandtech.com/show/11992/th ... x-review/6

The Xbox One X, which features 8 Jaguar (yeah, that old) cores @ about 2.3GHz at load, draws as much as ~170W, but generally a lot lower than that. I'd love for them to fit full-fat 8 core Zen/whatever cores into a next gen console and actually move them closer to PCs, but I just don't see that happening quite yet.


Raven Ridge 2500U is 4c / 8t at 2GHz and 25W TDP (including the iGPU). Double that, and you're ~50W for 8c at 2GHz Zen.

If Zen 2 is used in the consoles, there's a good chance it'd be part of AMD's mass-production/chiplet 7nm strategy: a big I/O chip to unify the mass-produced CPU (8c/16t Zen2 7nm dies) with an iGPU. I'm not saying there's any evidence that AMD is doing this for the PS5 / XBox-next, but it just seems to make more sense to make a custom PS5 / XBox-next I/O die (on 14nm, no less) and leverage the 7nm volume plus the packaging.

Alternatively, maybe PS5 / XBox-Next have a custom GPU on the I/O die which communicates to AMD's mass-produced 7nm chiplet.

In any case, AMD is proving that we've entered the age of chiplets. And thinking in terms of chiplets, it means AMD will do everything in its power to mass-produce the 7nm 8-core Zen2 die and apply it everywhere.
 
synthtel2
Gold subscriber
Gerbil Elite
Posts: 866
Joined: Mon Nov 16, 2015 10:30 am

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 5:41 pm

An R7 1700 is 8C, 3.2 GHz (in practice), and ~65W (fairly accurate in practice). They're doubling SIMD width from that, but it's also a much better process. I wouldn't be surprised if they still can't quite make it to 3 GHz (depending on the process' V/F curve), but 2.5 should be easy and 3.0+ is plausible.

The CPUs PC gamers typically look at are terrible at efficiency because they're clocked to the moon. The hardware is capable of far better.
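The "clocked to the moon" point follows from CMOS dynamic power scaling roughly as frequency times voltage squared, with voltage having to rise along the V/F curve as clocks go up. A back-of-envelope sketch; every constant here is an illustrative assumption, not a measured silicon value:

```python
def dynamic_power(freq_ghz, base_freq=2.0, base_volt=0.9, volt_per_ghz=0.15):
    """Rough CMOS dynamic power model: P ~ f * V^2, with voltage rising
    roughly linearly with frequency along an assumed V/F curve.
    Constants are illustrative, not real silicon data. Output is in
    arbitrary units, useful only for ratios."""
    volts = base_volt + volt_per_ghz * (freq_ghz - base_freq)
    return freq_ghz * volts ** 2

# Relative power cost of chasing clocks: 4.5 GHz vs 3.0 GHz
p30 = dynamic_power(3.0)
p45 = dynamic_power(4.5)
ratio = p45 / p30   # roughly 2.2x the power for 1.5x the frequency
```

Under these toy numbers, the last 50% of clock speed more than doubles power draw, which is why the same silicon run at console-style clocks looks so much more efficient.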
 
DeadOfKnight
Gold subscriber
Gerbil Elite
Topic Author
Posts: 718
Joined: Tue May 18, 2010 1:20 pm

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 7:04 pm

End User wrote:
DeadOfKnight wrote:
My point is that it really seems to me that the #1 priority should be to minimize the CPU bottleneck, and then get the fastest GPU you can afford with the rest.

And you believe the i7-5775c is the best way to "minimize the CPU bottleneck" for a 2080 Ti FE driving a 3440x1440 120Hz G-SYNC display?

Why do you think this topic came to mind? I know what being CPU-bound feels like for a lot of newer games. It's still not the case for 99% of games in my library, but it is for a handful, and it sucks more than being GPU-bound did before the upgrade.

My argument is that maybe you can get by with 45 fps and skip that shiny new GPU, but if you stick with an old quad core then you're going to have a hard time with these new games. Some CPU-bound games average over 60 fps, yet you'd never guess it was performing that well from the experience.

But what do I know? This is my subjective impression. I could be wrong. Maybe a lot of "CPU-bound" games are just poorly optimized for any configuration. What I do know is if I spent that money on a CPU upgrade this year, I could have probably gotten a bigger jump in GPU performance next upgrade cycle. But instead I'll get a CPU upgrade next time. I'll be lucky to get a 5% increase over the products that are already available today.

Point is, instead of the GPU being king for gaming, maybe it's the CPU. Maybe some recommendations we make are wrong.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
Kretschmer
Silver subscriber
Gerbil XP
Posts: 458
Joined: Sun Oct 19, 2008 10:36 am

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 10:18 pm

You still want to make balanced recommendations, with the caveat that builds should be planned with any upgrade cycles in mind. Upgrading a CPU or GPU later is cheaper than replacing both, and generally the GPU tech moves more quickly than processors.
 
JustAnEngineer
Gold subscriber
Gerbil God
Posts: 18402
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: CPU vs GPU Bottlenecks

Mon Dec 10, 2018 10:45 pm

I like the idea of timing your CPU and GPU upgrades based on their own merits rather than necessarily upgrading both at the same time.

Since Sandy Bridge launched in the second week of 2011, the pace of CPU+motherboard+memory upgrades was much much slower than the pace of graphics card upgrades. Then Ryzen happened in March of 2017, and we suddenly jumped from 4-core to 8-core gaming CPUs. Once Intel finally gets out of its 14++++++++++++++++++ nm funk, we'll see a long-delayed manufacturing process upgrade for another tangible CPU improvement.

I chose the $410 Core i7-9700K for my most recent gaming CPU upgrade, but if you can wait until the Ryzen 3000 series CPUs appear, that should shake things up and offer some better values.
i7-9700K, H100i v2, Z390M Pro4, 32 GiB, RX Vega64, Define Mini-C, SSR-850PX, C32HG70, RK-9000BR, MX518
 
End User
Gold subscriber
Minister of Gerbil Affairs
Posts: 2966
Joined: Fri Apr 16, 2004 6:47 pm
Location: Upper Canada

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 3:22 am

I've definitely held onto a CPU long enough for it to become a bottleneck.

I kept my OC'ed 3770K system for 5 years. That rig started with a GTX 580 SC. I then upgraded to dual GTX 770 SC in SLI and then finally to a GTX 1080 FE. I knew the GTX 1080 FE was restrained by the CPU in many games but the FPS increase was still worth it. Instead of upgrading the CPU/motherboard/memory I bought a G-SYNC display.
 
DeadOfKnight
Gold subscriber
Gerbil Elite
Topic Author
Posts: 718
Joined: Tue May 18, 2010 1:20 pm

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 4:13 pm

JustAnEngineer wrote:
Once Intel finally gets out of its 14++++++++++++++++++ nm funk, we'll see a long-delayed manufacturing process upgrade for another tangible CPU improvement.

If trends continue, they'll be stuck on 7nm for even longer, and the next die shrink (if it happens) will be even less significant.

Of course, 10 years ago they were telling us that scaling below 20nm would be impossible. Who knows what we'll see in the next few years. Once production using EUV tools has matured, it could open up new possibilities. From what I understand, it's multiple-patterning yield issues that have been holding us back.

With chiplets and die stacking, we don't need transistors to keep getting smaller to keep reaching higher. Of course, there will still be a point in the future when it will become prohibitively expensive to do so, but all the resources that have been put into fabrication R&D can instead be put into architectural efficiency.

That's not even considering how software could be improved to be more efficient as well to gain more performance that way.

At any rate, it looks to me like we can expect this cadence of years of minor improvements punctuated by a major breakthrough from here on out. GPUs will follow the same pattern; they just benefit more because their transistor budgets have kept growing.

Now if only AMD could put as much pressure back on Nvidia...
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
Ifalna
Gerbil Team Leader
Posts: 249
Joined: Sat Jan 28, 2012 11:14 am
Location: Celestis

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 5:15 pm

DeadOfKnight wrote:
Point is, instead of the GPU being king for gaming, maybe it's the CPU. Maybe some recommendations we make are wrong.

Well, it is true for MMOs. Both Final Fantasy XIV and World of Warcraft gobble up CPU power w/o any problem, esp. when there are many people around.

So it depends on the game. In most titles you will be GPU limited, esp when using some of the more sane GPU options at 4K.
The backbone of modern industrial society is, and for the foreseeable future will be, the use of electrical Power.
 
Topinio
Gerbil Jedi
Posts: 1641
Joined: Mon Jan 12, 2015 9:28 am
Location: London

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 6:28 pm

Ifalna wrote:
Well, it is true for MMOs. Both Final Fantasy XIV and World of Warcraft gobble up CPU power w/o any problem, esp. when there are many people around.

Not sure about that. WoW uses only 15-26% of my CPU and 50-60% of my GPU according to Task Manager, but only gets 50-70 FPS at max settings. (Total load is 30-40% on CPU and 50-60% on GPU.)

I don't understand this, even assuming I can nearly double those CPU numbers since Windows is counting threads, not physical cores. It doesn't look GPU- or CPU-limited.
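For what it's worth, the arithmetic behind why an aggregate Task Manager figure can sit low while a game is hard CPU-bound: the overall percentage averages across all hardware threads, so one pegged thread on a 4C/8T chip barely moves it. A toy illustration (the per-thread loads are hypothetical):

```python
def aggregate_util(per_thread_loads):
    """Task-Manager-style overall CPU%: the mean across hardware threads."""
    return sum(per_thread_loads) / len(per_thread_loads)

# Hypothetical 4C/8T CPU with the game's main thread pegged at 100%,
# a helper thread at 60%, and light background load on the rest:
loads = [100, 60, 5, 5, 5, 5, 5, 5]
overall = aggregate_util(loads)  # 23.75% overall -- looks idle, yet the
                                 # game is hard-bound on that first thread
```

Under numbers like these the machine reads as mostly idle even though the frame rate is capped by the one saturated core.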
Desktop: E3-1270 v5, X11SAT-F, 32GB, RX Vega 56, 500GB Crucial P1, 2TB Ultrastar, Xonar DGX, XL2730Z + G2420HDB
HTPC: i5-2500K, DH67GD, 6GB, RX 580, 250GB MX500, 1.5TB Barracuda
Laptop: MacBook6,1
 
JustAnEngineer
Gold subscriber
Gerbil God
Posts: 18402
Joined: Sat Jan 26, 2002 7:00 pm
Location: The Heart of Dixie

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 7:11 pm

I'd bet on WoW being very lightly threaded. It's loading one or two of the cores in your CPU and letting the others sit idle (or perhaps the single bottlenecking thread is bouncing around between cores).
i7-9700K, H100i v2, Z390M Pro4, 32 GiB, RX Vega64, Define Mini-C, SSR-850PX, C32HG70, RK-9000BR, MX518
 
Topinio
Gerbil Jedi
Posts: 1641
Joined: Mon Jan 12, 2015 9:28 am
Location: London

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 7:31 pm

It has about 30 threads, but only 1 or 2 serious ones, and those are on the same core/HWthread -- which is why I replied to Ifalna's '[w]ell it is true for MMOs [...] World of Warcraft gobble[s] up CPU power w/o any problem', because it doesn't.
Desktop: E3-1270 v5, X11SAT-F, 32GB, RX Vega 56, 500GB Crucial P1, 2TB Ultrastar, Xonar DGX, XL2730Z + G2420HDB
HTPC: i5-2500K, DH67GD, 6GB, RX 580, 250GB MX500, 1.5TB Barracuda
Laptop: MacBook6,1
 
dragontamer5788
Gerbil Team Leader
Posts: 200
Joined: Mon May 06, 2013 8:39 am

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 7:39 pm

Topinio wrote:
It has about 30 threads, but only 1 or 2 serious ones, and those are on the same core/HWthread -- which is why I replied to Ifalna's '[w]ell it is true for MMOs [...] World of Warcraft gobble[s] up CPU power w/o any problem', because it doesn't.


You aren't understanding the single-thread-bound problem, then. CPU power isn't judged only by multithreaded performance; it's also judged by single-threaded performance. It's why people buy Intel even when AMD offers far superior multithreaded performance at lower cost.

The i9-9900K is considered the best CPU for gaming for a reason: 5GHz to maximize the performance of single-threaded code. More often than not, when games are CPU-bound, they're single-thread bound.
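The single-thread-bound point can be sketched with a toy Amdahl-style frame-time model: if one critical-path thread dominates the frame, extra cores barely help, but higher clocks shrink everything. All the numbers here are hypothetical:

```python
def frame_time_ms(serial_ms, parallel_ms, cores, clock_scale=1.0):
    """Toy Amdahl-style model of one game frame: a critical-path thread
    (serial_ms) plus work that splits evenly across cores; higher clocks
    (clock_scale > 1) speed up both parts."""
    return (serial_ms + parallel_ms / cores) / clock_scale

# Hypothetical frame: 10 ms on the main thread, 8 ms of parallelizable work.
base   = frame_time_ms(10, 8, cores=4)                   # 12.0 ms
more_c = frame_time_ms(10, 8, cores=8)                   # 11.0 ms: doubling cores barely helps
faster = frame_time_ms(10, 8, cores=4, clock_scale=1.2)  # 10.0 ms: +20% clocks help more
```

Doubling cores only shaves the parallel slice, while a clock bump cuts the serial critical path too, which is the usual argument for high-clocked parts in games.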
 
synthtel2
Gold subscriber
Gerbil Elite
Posts: 866
Joined: Mon Nov 16, 2015 10:30 am

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 8:25 pm

Topinio wrote:
It has about 30 threads, but only 1 or 2 serious ones, and those are on the same core/HWthread -- which is why I replied to Ifalna's '[w]ell it is true for MMOs [...] World of Warcraft gobble[s] up CPU power w/o any problem', because it doesn't.

That core is probably spending very close to 100% of the time active.

dragontamer5788 wrote:
More often than not, when games are CPU-bound, they're single-threaded bound.

For TR readers, probably. The split is above 4C8T for a lot of games now, though, and CPUs down to 2C4T are still very common.
 
DeadOfKnight
Gold subscriber
Gerbil Elite
Topic Author
Posts: 718
Joined: Tue May 18, 2010 1:20 pm

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 9:30 pm

synthtel2 wrote:
For TR readers, probably. The split is above 4C8T for a lot of games now, though, and CPUs down to 2C4T are still very common.

Yeah, most older CPU-bound games are bound up on one core (WoW being a prime example), but there are a handful of new games that are now being bound by the number of cores/threads. It seems some devs have started targeting 8 threads specifically, as they don't tend to scale upwards from there. This has likely carried over from developing on console and the fact that there has been a decent number of PC users with 8 threads available. In such circumstances, 8 physical cores is better than 8 logical cores. Of course, they need to optimize their games for lower core counts, but some games lately have been released with some issues and/or bugs on 4 core CPUs.

Don't ask me for an example right now though, I don't want to give you the wrong title off the top of my head. There's plenty of CPU reviews out there talking about specific game issues if you care to look for it.
Intel Core i7-5775c, Asus Maximus VII Formula, Win 10 Pro
RTX 2080 Ti FE, Corsair 2x8GB DDR3-1866, Corsair AX860
Corsair H105, WD Red 4TB x2, Creative X-Fi Titanium HD
Samsung 850 EVO 1TB, Alienware AW3418DW, Corsair 450D
 
synthtel2
Gold subscriber
Gerbil Elite
Posts: 866
Joined: Mon Nov 16, 2015 10:30 am

Re: CPU vs GPU Bottlenecks

Tue Dec 11, 2018 10:17 pm

DeadOfKnight wrote:
Yeah, most older CPU-bound games are bound up on one core (WoW being a prime example), but there are a handful of new games that are now being bound by the number of cores/threads.

It's a lot more than a handful if you're on 2C4T. Even Skyrim is noticeably faster on 4C4T than 2C4T unless you want to run potato-level shadows, and pretty much everything released in the last 5 years is at least that threaded.
