Man, I made a smartass response to this whole thread because it was pretty elitist, but now people are just spouting bad science and it's pissing me off, so here's a real answer. MAN.
So the human eye DOES capture images of the world. Technically, each eye captures half of a stereoscopic image of the world, and our brain fuses them into a simulation of 3D space. We capture one such image after another, continuously. As our brain pieces these together, it also interpolates them. That means much of what you see is actually an estimation between frames. It's a bit like when you have a 120Hz interpolating filter on your TV, only it doesn't look like shit, because our brain also has keyframes.
For those of you who don't know, keyframes are a thing in animation where a whole actor (say, a dude swinging a sword) is drawn at a critical point. These points are when an action begins, when an object changes direction, any interruption to the smooth flow of action. Interpolated TV looks weird because every frame is in-betweened to the next with an additional frame, and this forcibly smooths even irregular actions; the real world is jerky. But your brain doesn't do this: it recognizes important movement and pieces your vision together with proper emphasis on it. That is how good your brain is.
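To make the keyframe/in-between idea concrete, here's a toy sketch (function names are mine, not from any animation package): given two keyframe values, an interpolator fills in every frame between them at a constant rate.

```python
def lerp(a, b, t):
    """Linear interpolation between keyframe values a and b at fraction t in [0, 1]."""
    return a + (b - a) * t

def in_between(key_start, key_end, n_frames):
    """Generate n_frames values from key_start to key_end inclusive,
    the way a naive interpolating filter fills frames between two keyframes."""
    if n_frames < 2:
        return [key_start]
    return [lerp(key_start, key_end, i / (n_frames - 1)) for i in range(n_frames)]

# A sword swing keyed at angle 0 and 90 degrees, displayed over 5 frames:
frames = in_between(0.0, 90.0, 5)
print(frames)  # [0.0, 22.5, 45.0, 67.5, 90.0]
```

This forced smoothness between every pair of frames is exactly why interpolating TVs look uncanny on jerky motion: the in-betweener has no idea which frames are the important ones.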
Now, the capture rate of your eyes is not fixed. It's based on saccadic eye movements, or saccades. These are tiny, involuntary movements of the eye that happen whenever your brain becomes aware of something in the environment. Did a pretty lady or handsome man walk past? Snap to that shit. Oh! A bird! LOOK AT IT. These even happen when you try to look at a totally serene image like a painting. Your eyes are constantly readjusting to get a more robust set of angles for the stereoscopic image your brain is building you.
On every saccade, your eyes capture an image. But your rate of saccadic motion varies. If you're bored or tired, you have fewer saccades. If you're pumped up on adrenaline and terrified for your life, you have a bunch more than normal. Same deal if you're really happy or turned on or excited. That's why a married dude can, to his wife's dismay, remember what the first girl he kissed was wearing, but not what his wife wore the previous day. It's also why we remember tragedy so vividly that it overshadows a normal "good" day: we literally take smoother, better video of it.
Here's where it gets weird.
There is no way to sync up saccades to real frame rates. Even if you could average out a person's saccades (call it about 35 per second), they would not match a monitor's refresh rate, because the monitor is consistent while your eye might take three samples in quick succession and then be lazy for a while. For this reason a higher frame rate is always better, because it lowers your chance of seeing the same frame twice. Remember that most of what you see is fuzzed in by your brain, but your brain can also iron out the timing problems caused by your saccades. So the more unique frames of an animation you see, the better your brain can make it look when it actually builds the image. This means that in theory you can see improvement from ANY increase in frame rate: 900fps will look better than 750 UNDER SOME CIRCUMSTANCES, absolutely. It's just science.
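You can sketch that "same frame twice" argument with a toy simulation (my own model with made-up numbers, not vision science): sample a fixed-refresh display at irregular times averaging ~35 samples per second, and count how often two consecutive samples land on the same displayed frame.

```python
import random

def duplicate_fraction(fps, sample_times):
    """Fraction of consecutive samples that see the same displayed frame.
    A display at `fps` shows frame number floor(t * fps) at time t (seconds)."""
    frames = [int(t * fps) for t in sample_times]
    repeats = sum(1 for a, b in zip(frames, frames[1:]) if a == b)
    return repeats / (len(frames) - 1)

# Irregular "saccade-like" sample times: ~35 per second on average, heavily jittered.
random.seed(42)
t, samples = 0.0, []
for _ in range(1000):
    t += random.expovariate(35)  # mean gap of 1/35 s, but highly irregular
    samples.append(t)

low = duplicate_fraction(30, samples)
high = duplicate_fraction(144, samples)
print(f"30 fps: {low:.0%} repeated frames, 144 fps: {high:.0%}")
```

Under this toy model the higher refresh rate always wastes fewer of the irregular samples on repeated frames, which is the whole point.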
ON THE OTHER HAND, most of what we see IS fake. Your brain takes what's there and just guesses what happens between frames. That means any frame rate close to or above the rate of your saccades will produce roughly comparable results. In other words, above about 40-45fps, the same portion of a game you see will be invented by your brain as at 200, 300, or 9874fps. Or reality, for that matter.
So what does this mean for gaming?
It means gaming should have two goals. First, meet a minimum frame rate for realistic motion. Since your saccades happen, on average, in the mid 30s per second, that average should be exceeded for best results. So 40fps and up is where we hit the "smooth enough for the brain" threshold. Second, any fps above that will make motion look better and help your brain more accurately show you what you are seeing. There is NO CAP on this. ANY AMOUNT OF FPS will help, however minutely. But it matters way less than meeting that initial value.
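The arithmetic behind that threshold is nothing fancy, just 1000/fps, but it's worth seeing: at ~35 saccades per second the average gap between samples is about 28.6 ms, and 40fps is where each frame's screen time first drops below that gap.

```python
def frame_time_ms(fps):
    """Milliseconds each frame stays on screen at a given frame rate."""
    return 1000.0 / fps

mean_saccade_gap_ms = 1000.0 / 35  # ~28.6 ms between samples at 35/second

for fps in (30, 40, 60, 144):
    beats_eye = frame_time_ms(fps) < mean_saccade_gap_ms
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame, "
          f"shorter than the average saccade gap: {beats_eye}")
```

30fps holds each frame for 33.3 ms, longer than the average gap, so on average you sample some frames twice; at 40fps (25 ms) and above, you don't.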
Let's go back to animation.
Remember when I talked about keyframes? Well, games have these too. In fact, in a modern game engine, the engine itself may do the in-betweens. And usually a keyframe is what we need to react to in a fast head-to-head game. If you are playing a shooter with hitscan bullets, the moment a muzzle flashes is the moment a shot is out. IF A FRAME RATE IS TOO LOW to show the exact moment in an animation when an attack or other interaction happens, then you are at a disadvantage, no matter how small, against someone seeing the full event.
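You can sketch that disadvantage directly (a toy model with made-up numbers, not any real game's netcode): a renderer only shows an event if a frame boundary falls inside the event's window, so at a low frame rate a short muzzle flash can be skipped entirely.

```python
import math

def event_visible(event_start, event_duration, fps):
    """True if at least one frame is presented while the event is on screen.
    Frames are presented at times k/fps (seconds) for integer k."""
    first_frame_after = math.ceil(event_start * fps) / fps
    return first_frame_after < event_start + event_duration

# A 10 ms muzzle flash starting 1 ms after a frame is shown:
flash_start, flash_len = 0.001, 0.010
print(event_visible(flash_start, flash_len, 30))   # False: next frame isn't until 33.3 ms
print(event_visible(flash_start, flash_len, 144))  # True: next frame at ~6.9 ms
```

The 30fps player literally never sees that flash; the 144fps player gets it within a few milliseconds of it happening.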
Again, it doesn't take FPS through the roof for this. In fact, in some games, physics and even character motion run at a lower rate than the game can potentially render. In Crysis 1, for instance, physics were calculated at up to 15fps, and sometimes lower, while the frame rate of the game was totally uncapped. This means that if you were playing at 120fps (and jesus, good luck playing Crysis at 120fps even now), a rolling barrel only checks its interaction with other objects every 8th frame (120/15 = 8).
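That decoupling is the classic fixed-timestep game loop. Here's a minimal sketch of the generic pattern (not Crysis's actual code): rendering banks up time, and physics steps fire in fixed increments whenever enough time has accumulated, so several rendered frames reuse one physics state. I use exact fractions here only to keep the toy model free of float drift.

```python
from fractions import Fraction

def physics_steps(render_fps, physics_fps, n_frames):
    """Count how many fixed physics steps fire across n_frames rendered frames."""
    dt_render = Fraction(1, render_fps)
    dt_physics = Fraction(1, physics_fps)
    accumulator = Fraction(0)
    steps = 0
    for _ in range(n_frames):
        accumulator += dt_render           # time banked by drawing one frame
        while accumulator >= dt_physics:
            steps += 1                     # the barrel actually moves here
            accumulator -= dt_physics
    return steps

# 120 rendered frames in one second with physics fixed at 15 Hz:
print(physics_steps(render_fps=120, physics_fps=15, n_frames=120))  # 15
print(120 // 15)  # each physics state is shown for 8 consecutive frames
```

So the screen can be silky-smooth while the simulation underneath ticks along at a fraction of the rate, and nobody playing notices until a barrel clips through a wall.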
So for those of you claiming that 30fps hurts your eyes? Sadly, it's mostly in your mind. Science says so. But the harder you look, the more you will notice the lower rate, because that's how eyes work. And for those of you saying you can't notice fps above 30? Sorry, your eyes can, even if you don't think you do. And to the reaction time crowd: you're kind of right, assuming you have nerves like a steel trap anyway. But most importantly, to all the developers and publishers who think capping a game at 30fps makes it look better? No, again, the human eye does not work that way. A higher fps never hurts; only going below that minimum to satisfy the brain hurts. Stop lying to customers and uncap. You aren't providing any kind of feature.