Jonathan Hornsby said:
As I said in a previous post in this thread, you've trained yourself to see it. By default, a person's senses aren't that fine-tuned. Not to lavish you with undue praise, but the comparison is somewhat like basing the average person's running speed solely on Olympic gold medalists. The simple fact is that there is a range for this, a certain tolerance for error. No two people are exactly alike, but there is a minimum and maximum sensitivity, and those of us on the low end of the spectrum who haven't spent years obsessing over pixels are already reaching our limit. Yours is a bit higher, good for you, but it won't be too long before your limit is reached too. The actual average gamer is already about at their limit.
Here
http://amo.net/NT/02-21-01FPS.html
http://www.100fps.com/how_many_frames_can_humans_see.htm
http://www.cameratechnica.com/2011/11/21/what-is-the-highest-frame-rate-the-human-eye-can-perceive/
Proof that you are wrong. It only took me a minute on Google. ^^
The last one even says that you'll notice FPS differences more on a bigger screen.
24~30 FPS is fine to make something look continuous rather than like a slide show, but everything above 30 does add to the experience. Sure, The Hobbit has been brought up several times, but the main reason for that is that people are used to 24 FPS for their movies.
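To put some numbers on the difference, here's a quick sketch (in Python, just for illustration) of the time each frame stays on screen at the rates mentioned above; a higher frame rate means each frame lasts fewer milliseconds, so motion is sampled more finely:

```python
# Frame time: how many milliseconds a single frame is displayed at a given rate.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 24 fps -> 41.7 ms per frame
# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
```

So going from 30 to 60 FPS halves the time between updates, which is why the jump is so noticeable in fast motion even though both rates look "continuous".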
For games, more is better. PC gamers have been getting 60 FPS for years now, and while this console gen should have been able to do 1080p/60, it can barely do 1080p/30. But I would bet that if there were a console gen that ran at 60 FPS for its entire duration, everybody would get pissed if it were changed back to 30.
Also, note the hypocrisy of some: when the consoles were announced they sang the praises of 60 FPS, but now that it's not achievable it suddenly doesn't matter. Just like resolution.