Obvious troll is obvious.
No decade-old desktop PC is adequate for modern games.
Yes, it is, unless you are a die-hard videophile or need an insane framerate, a.k.a. a tiny but vocal minority. The masses and most current gamers play their content at whatever settings the auto-detect chooses for them. The difference between "Ultra" and "High" in most games is subtle at best, and "Medium" detail isn't really that bad. It is only at "Low" where the loss of graphical fidelity becomes painfully apparent. As long as the framerate sustains ~50-60 FPS, most gamers are content with it.
The reason is that gaming consoles have been dictating the baseline for nearly everything out there, and there's no economic incentive to create a PC exclusive that pushes the envelope hard. Crysis 1 was the last serious attempt at pushing the envelope, and it had mediocre market performance.
A sizeable number of PC enthusiasts have already given up on the constant upgrade treadmill. The returns aren't what they used to be unless you use your system for real-world work.
FYI, my current 9700K is only slightly faster than my previous 3570K rig at gaming. The 9700K only shows its prowess when you throw real-world workloads at it. I suppose the advent of the PS5 and the next-generation Xbox will finally raise the bar high enough that quad-core rigs might start becoming inadequate.