Forgive me if this post seems a bit jaded. I have been an avid PC gamer since the Doom II days and became obsessed when OpenGL extensions were released for Quake. These days, though, there seems to be little incentive to stick with it: consoles have almost caught up with PCs in terms of immersion, despite having nowhere near the memory bandwidth, floating-point throughput, or pixel-, texture-, and shader-pushing power.
I don't believe living in a console-port world is the sole culprit. I did enjoy PC exclusives such as Crysis, Crysis Warhead, and StarCraft II, but nothing about them seemed too technologically advanced to port to the Xbox 360 or PS3. Yes, I realize the textures are higher res and the polygon counts are probably a bit beyond the reach of current-gen consoles, but I'd wager not by much.
Look at it this way: I typically game at 720p. I see no reason to sacrifice high framerates for image quality when there is so little to gain. Mostly you get fewer jaggies, which antialiasing can handle anyway. Even in the most demanding PC games, I rarely see textures that get more detailed as resolution increases; they just get stretched.
My theory is that until PC CPUs/GPUs can handle raytracing efficiently, or at least a raster/raytrace hybrid, PCs will have little headroom in immersion over consoles. I'm not sure whether today's hardware is up to the task or whether we need something far more efficient, but I wish someone would try, damn it!
Ryzen 7 1800X at 3.9 - Corsair H60i - GA AB350 Gaming - 32GB DDR4 2666 at 14,14,14,34 - Gaming X GTX 1080 - 250GB WD Blue SSD - 2TB Toshiba 7200rpm HDD