Forgive me if this post seems a bit jaded. I have been an avid PC gamer since the Doom II days and became obsessed when OpenGL support was released for Quake. These days, though, there seems to be little incentive to stick with it, as consoles have almost caught up with PCs in terms of immersion, despite having nowhere near the memory bandwidth, floating-point, pixel, texture, or shader-pushing power.
I don't believe living in a console-port world is the sole culprit. Even though I enjoyed PC exclusives such as Crysis, Crysis Warhead, and StarCraft II, nothing about them seemed too technologically advanced to be ported to the Xbox 360 or PS3. Yes, I realize the textures are higher-res and the polygon counts are probably a bit beyond the reach of current-gen consoles, but I'd wager not by much.
Look at it this way: I typically game at 720p. I see no reason to sacrifice high framerates for image quality when there is little to gain. Mostly it's fewer jaggies, which can be fixed with antialiasing anyhow. Even in the most demanding PC games, I rarely see textures that get more detailed as resolution increases; they just get stretched.
My theory is that until PC CPUs/GPUs can handle raytracing efficiently, or perhaps a raster/raytrace hybrid, PCs will have little headroom in immersion over consoles. I'm not sure whether we already have capable enough hardware or need something far more efficient, but I wish someone would try, damn it!
Main: Core i5 6600K - 16GB DDR4-2400 - Radeon R9 390 - 128GB V300 SSD - 1TB WD Blue 7200rpm HDD
HTPC: Core i7 870 - 8GB DDR3-1333 - Radeon R7 360 - 1.5TB Barracuda 7200rpm HDD