We get some interesting questions and article suggestions via email from time to time, and we just don't have time to address them all with a proper article, unfortunately. I received one such message recently that asks a burning question we've seen posed in many ways over the years. Let me just reprint this reader's question for you:
I'm just throwing out a suggestion for an article if you ever get bored or run out of ideas. I always hear about older CPUs bottlenecking newer video cards, so it would be cool to actually see this tested. Especially since I always hear people say things like "an Athlon X2 will heavily bottleneck an HD 48xx-series GPU" and lines along those statements. However, I don't think I've ever seen anyone substantiate those claims with actual data. I can't distinguish BS answers from factual answers, since everyone seems to have their own opinions and views. I think it would be great to see an article investigating this. Of course, I understand you're probably quite busy a lot of the time, but I figured it's worth a shot to provide a suggestion or give you an idea.
This topic never goes away, but it is a rather difficult question to answer definitively, because it's endlessly complex (or something close to it). Here's my attempt at a quick answer, which I figured some folks might find interesting.
Yeah, that is an interesting question. Complicated, too. Much depends on the workload you're using, both for the CPU and the GPU. We haven't focused an article on just this question, but we have looked at performance scaling in various ways.
Here's an example with one GPU and multiple CPUs at different display resolutions:
And here's another with multiple GPUs on one fast CPU at different resolutions:
The reality is that you need to have the right balance of CPU and GPU power for the display settings and resolution chosen in a particular game. But, as the first graph there shows, all of the CPUs we tested will average nearly 60 FPS in Far Cry 2, so the GPU is the primary bottleneck.
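The balance idea above can be sketched as a toy model: if the CPU and GPU each work on a frame in sequence (ignoring any pipelining overlap), the effective frame rate is bounded by whichever component is slower. A minimal sketch, with entirely hypothetical numbers:

```python
def effective_fps(cpu_fps, gpu_fps):
    """Toy bottleneck model: the effective frame rate is limited by
    the slower of the CPU-bound and GPU-bound rates."""
    return min(cpu_fps, gpu_fps)

# Hypothetical: a fast CPU paired with a GPU struggling at high resolution.
# The GPU is the bottleneck, so a faster CPU wouldn't help.
print(effective_fps(cpu_fps=120, gpu_fps=60))  # -> 60

# Hypothetical: the same GPU at lower resolution with a slow dual-core.
# Now the CPU caps the frame rate instead.
print(effective_fps(cpu_fps=45, gpu_fps=90))   # -> 45
```

Lowering the resolution or detail settings raises the GPU-bound rate, which is why a CPU that was invisible at 2560x1600 can become the limiting factor at 1024x768.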
You have to drop down to the very slowest PC processors in order for the CPU to become any kind of bottleneck in a recent game, especially if it's a console port, since console CPUs are dreadfully slow. Even the Pentium E6300 can sustain 30+ FPS in Far Cry 2:
Of course, this whole equation will change with a different game or different visual quality settings (or switching from DX10 to DX9) in this same game. But generally speaking, these days, even a $90 Athlon dual-core is likely to run most games well, with the possible exception of more complex PC-native RTS games and the Great Exceptions, Crysis and Crysis Warhead. Note the frame rates consistently in excess of 120 FPS for Left 4 Dead 2 and Wolfenstein in our Lynnfield review, for instance. I do advise gamers to avoid quad-core CPUs with really low clock speeds. A higher-frequency dual-core is a better bet when the going gets rough.
I'm not sure we can dedicate an article to this issue soon—this is a very tough question to answer definitively—but we do try to provide the information you need in our reviews to make a smart buying decision. I hope this helps a little!