Well, that's sort of informative. Too bad they don't show anything in that link beyond "up to XX percent!". What settings were used? What framerate? Am I the only one who doesn't care about a 20% boost over 100 FPS, as would be the case with Resident Evil 5? I have that (pathetic, terrible) game, and it's the epitome of a console port that supposedly uses DX10 for something, yet it runs completely fluidly with all settings maxed on a Phenom II @ 3.2 GHz and a Radeon 4850, so any overclock is going to go completely unnoticed regardless of what game settings are used.
The point is that there are probably five games out there that will meaningfully make use of more than a few cores. Don't you agree? Even in those hypothetical five cases, it's not so clear cut. Take Civ 5: look here to see Intel's new quad-core i5-2400 embarrass AMD's fastest hexa-core 1100T.
And the Pentium EE 840 still puts up nearly acceptable frame rates. At realistic game settings, by the time your core count is your bottleneck, your entire CPU is going to be the bottleneck, and your GPU is going to have to be way outclassing your CPU. There was a time when I was actually, truly, and meaningfully CPU bottlenecked, and that was with an X2 3600+ @ 2.4 GHz and a Radeon 4850 playing Crysis. With an Athlon II X3 at 3.7 GHz, you'd have to have a Radeon 6950 or better for the CPU to even begin to be a meaningful bottleneck.
Having said all that, it probably doesn't matter. Like I said, if your core count is holding you back, then your whole CPU is going to be holding you back. So, three cores or four cores: pick one and just play some games!