Presumably, a jitter pattern alternating between five- and 15-millisecond frame times would be less of an annoyance than a 15- and 45-millisecond pattern. The worst example we saw in our testing alternated between roughly six and 20 milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem. We may need to spend more time with (ugh) faster TN panels, rather than our prettier and slower IPS displays, in order to get a better feel for the stuttering issue.
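As a back-of-the-envelope illustration (my own sketch, not anything from the article): both patterns alternate with the same 3:1 ratio, but the slower pair swings by 30 ms per frame instead of 10 ms, which is presumably what makes it more noticeable.

```python
# Compare two alternating frame-time patterns by average frame rate
# and frame-to-frame swing. Patterns are the hypothetical ones from
# the quote above, not measured data.
def summarize(pattern_ms, frames=120):
    times = [pattern_ms[i % len(pattern_ms)] for i in range(frames)]
    avg_ms = sum(times) / len(times)
    swing_ms = max(times) - min(times)
    return round(1000 / avg_ms, 1), swing_ms

print(summarize([5, 15]))    # -> (100.0, 10): 100 fps average, 10 ms swing
print(summarize([15, 45]))   # -> (33.3, 30): 33 fps average, 30 ms swing
```

Same jitter ratio either way, but triple the absolute swing at the low end of the frame-rate range.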
In the end, despite the tone of the article, if you can't see the problem and you're the one researching it, it's a safe bet most others won't see it either. It's a real problem for those who do perceive it, but it's a lot like CRT flicker: either you see it and it drives you batty, or you don't get what the crazy person is yelling about.
For years I've been trying to explain the general concept behind this article as the reason you want more than 60 fps, despite the (incorrect) idea that you can't perceive more than 60 fps and the fact that monitors these days typically refresh at 60 Hz. If your 60 fps arrives as two frames per refresh, only one frame from each pair ever reaches the screen, so you're effectively seeing 30 Hz of gameplay.
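A rough simulation of that point (my own sketch with made-up timestamps, assuming the display simply scans out the most recently completed frame each refresh):

```python
# Count how many distinct frames a 60 Hz display actually shows when
# the GPU renders 60 fps but delivers frames in tight pairs.
REFRESH_HZ = 60

def frames_shown(timestamps, refreshes=60):
    shown = 0
    last = None
    for k in range(refreshes):
        t = k / REFRESH_HZ
        # The display scans out the most recently completed frame.
        ready = [i for i, ts in enumerate(timestamps) if ts <= t]
        if ready and ready[-1] != last:
            shown += 1
            last = ready[-1]
    return shown

# 60 evenly spaced frames: a new image every refresh.
even = [i / 60 for i in range(60)]
# 60 frames arriving as pairs 1 ms apart within a single refresh
# interval: the first frame of each pair is overwritten before scanout.
paired = [i / 30 + d for i in range(30) for d in (0.002, 0.003)]
print(frames_shown(even))    # -> 60
print(frames_shown(paired))  # -> 30
```

Same 60 fps on the Fraps counter in both cases, but the paired delivery puts only 30 unique images per second on screen.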