As PC gamers, we know that frame rates over 30 FPS make animated content look more true-to-life (which, when CG imagery is involved, can sometimes conjure up the dreaded uncanny valley, as with the Hobbit films). But what makes our eyes and our brains so sensitive to frame-rate differences?
Simon Cooke, a developer from Microsoft's Xbox Advanced Technology Group, thinks he's cracked it. Writing on his personal blog, Cooke attributes the effect to the way our eyes constantly jitter to capture extra information. These ocular microtremors, he says, happen at "roughly 83.68Hz (on average, for most people)." Cooke goes on to explain:
[I]f we accept that an oscillation of 83.68Hz allows us to perceive double the resolution, what happens if you show someone pictures that vary (like a movie, or a videogame) at less than half the rate of the oscillation?
We're no longer receiving a signal that changes fast enough to allow the super-sampling operation to happen. So we're throwing away a lot of perceived-motion data, and a lot of detail as well.
If it's updating higher than half the rate of oscillation? As the eye wobbles around, it'll sample more details, and can use that information to build up a better picture of the world. Even better if we've got a bit of film-grain noise in there (preferably via temporal anti-aliasing) to fill in the gaps.
It just so happens that half of 83.68Hz is about 42Hz. So if you're going to have high-resolution pulled properly out of an image, that image needs to be noisy (like film-grain) and update at > 41Hz. Like, say, The Hobbit. Or any twitch-shooter.
Less than that? Say, 24fps? Or 30fps for a game? You're below the limit. Your eye will sample the same image twice, and won't be able to pull out any extra spatial information from the oscillation. Everything will appear a little dreamier, and lower resolution. (Or at least, you'll be limited to the resolution of the media that is displaying the image, rather than some theoretical stochastic limit).
The full blog post goes into much more detail—some of which is, frankly, a little over my head. Cooke's hypothesis is definitely an interesting one, though. In his conclusion, Cooke recommends that game developers aim for a frame rate of at least 43 FPS or so, with "temporal antialiasing, jitter or noise/film grain to mask over things and allow for more detail extraction."
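The arithmetic behind the hypothesis is simple Nyquist-style bookkeeping: a display only lets the eye's jitter extract extra detail if it refreshes faster than half the microtremor frequency. The sketch below (not Cooke's code, just an illustration of his numbers) classifies a few common frame rates against that threshold:

```python
# Illustrative sketch of the arithmetic in Cooke's hypothesis.
# The 83.68 Hz microtremor figure is the average he cites; the
# classification is just a Nyquist-style comparison, nothing more.

OCULAR_MICROTREMOR_HZ = 83.68              # average tremor rate cited by Cooke
THRESHOLD_HZ = OCULAR_MICROTREMOR_HZ / 2   # ~41.84 Hz: half the oscillation rate

def benefits_from_microtremor(fps: float) -> bool:
    """Per the hypothesis, the eye's jitter can pull extra spatial
    detail from the image only when frames update faster than half
    the tremor rate."""
    return fps > THRESHOLD_HZ

for fps in (24, 30, 48, 60):
    verdict = "above" if benefits_from_microtremor(fps) else "below"
    print(f"{fps} fps is {verdict} the ~{THRESHOLD_HZ:.2f} Hz threshold")
```

Run it and film's 24 fps and console-style 30 fps land below the line, while 48 fps (The Hobbit's HFR) and 60 fps land above it, matching Cooke's ~43 FPS recommendation once a small margin is added.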