It's a tricky topic to search for, and I haven't found much written on it, but either way, it's worth discussing.
Personally, I try to keep my games at or above 60fps by whatever means necessary, though consistency is just as important. For example, if I crank up the settings, I'll get drops to 20-30fps, and when that happens, it's preferable to cap the framerate at 30fps for a more consistent experience. Most console games designed to run at 30fps do just that, so the game doesn't suddenly swing between smoother and choppier out of nowhere.
In most cases, though, we're looking at 60fps, as it makes aiming and movement easiest in any game. The most common method to achieve this is vsync (usually double or triple buffered), but this, of course, introduces a small amount of lag. The issue with input lag for me isn't so much being a "pro gamer" as it is simply wanting better aiming response as part of the gaming experience. (And given how badly I aim sometimes, any elimination of latency is helpful.) Because vsync introduces lag, it seems more ideal to tune the settings to ensure 60+ fps, and then cap the framerate at 60 to avoid both vertical tearing and the added latency. Plus, my experiences with triple buffering have been less than pleasant. I enabled it in Doom 3 (pretty sure it was triple buffering), and the last time I played, it was unbearable. After disabling it, I was immediately able to make the quick, precise aiming movements necessary to survive in there. I'm pretty sure the game itself caps at 60fps as well, as I never really noticed any vertical tearing.
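For a rough sense of the numbers (back-of-the-envelope arithmetic, not measurements): at 60Hz each refresh lasts about 16.7ms, so, very roughly, every extra frame that vsync queues up ahead of the display adds about one refresh of latency:

```python
refresh_hz = 60
frame_ms = 1000 / refresh_hz                      # one refresh ≈ 16.7 ms
print(f"one frame at {refresh_hz} Hz = {frame_ms:.1f} ms")

# rough upper bounds: double buffering can queue ~1 extra frame, triple ~2
for buffered in (1, 2):
    print(f"{buffered} queued frame(s) ≈ {buffered * frame_ms:.1f} ms of added lag")
```

That's in the same ballpark as the frame or two of display lag discussed below, which is why it's noticeable at all.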
However, I am looking for a more universal framerate-capping solution. Has anyone had luck with the FPS Limiter program? It seems restricted to DirectX 7 through 9 and OpenGL, with nothing for DirectX 10 and 11 (e.g., Battlefield 3).
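For what it's worth, the core idea behind these limiter tools is just frame pacing: finish a frame, then wait out whatever is left of the 16.7ms budget before starting the next one. Here's a minimal sketch of that loop (my own illustration, not the actual FPS Limiter code; real limiters hook the game's present/swap call):

```python
import time

def run_capped(frames: int, target_fps: float = 60.0) -> float:
    """Run a dummy render loop capped at target_fps; return elapsed seconds."""
    frame_time = 1.0 / target_fps               # ~16.7 ms budget at 60 fps
    start = time.perf_counter()
    next_deadline = start
    for _ in range(frames):
        # (the game's actual simulate + render work would happen here)
        next_deadline += frame_time
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)               # wait out the rest of the frame budget
        else:
            next_deadline = time.perf_counter() # behind schedule: resync, don't spiral
    return time.perf_counter() - start

# 30 capped "frames" at 60 fps should take roughly half a second
print(run_capped(30))
```

The nice property versus vsync is that nothing is queued up behind the display: the frame you just rendered is the freshest one available, so you get the pacing without the buffered-frame latency.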
In addition to this, I'm curious about others' experiences with the input lag of a TN (over full-resolution DVI) versus the non-existent input lag of a CRT. Oddly enough, I never noticed a difference between my Acer X223w and a Samsung CRT, even with the two monitors side by side. If that Acer does lag, it's probably no more than one frame behind, as it seems to do pretty well. I also have a 1280x1024 Samsung LCD with only a VGA input that seems to have the same latency as the Acer. I've even been unable to notice a difference on my Xbox between a CRT and the Acer's VGA input. Perhaps the only way to know is to simply test it.
When switching to the TV, though (a Sony KDL55EX500), I get significant lag even in game mode, over HDMI/DVI, at 1920x1080. It seems to be two frames behind (an earlier test I did against a CRT seemed to show roughly 30ms of lag), and it has been enough of a problem to affect my aiming in the FPSes I've played on there. (For both Xbox and PC.)
What I guess I'm asking here is for fellow gerbils' experiences with TN versus CRT input lag. Maybe the almost non-existent input lag is limited to certain TNs; I've noticed a bit of lag on a friend's ASUS monitor, which has DVI, VGA, and HDMI inputs. He says he doesn't care, which is odd considering he spent money on a 4000 DPI gaming mouse. I'm also curious about (as mentioned before) others' experiences with framerate capping, if anyone does that.
I suppose I'm also curious about input lag between IPS and TN panels. It seems that most VA-based panels are prone to lag, or at least the large VAs/PVAs used in HDTVs, and that they aren't particularly good for fast-moving, precision games. I'll have to do some searching on plasma TVs to see how much they lag, though it's too late to regret the EX500 purchase I made for my family last year. (Plus, that TV is excellent for video and movies, despite a bit of backlight bleed through the blacks.)