MikailCaboose said:
Zhukov said:
Can the untrained human eye even tell the difference between 30fps and 60fps?
Not really. What actually becomes noticeable is when the FPS doesn't remain constant; the eye gets drawn to the inconsistency. But a stable 30 FPS is little different from 60 FPS as far as the human eye is concerned.
I can tell the difference quite readily (I actually estimated my own TV's frame delay/stutter to within 20 milliseconds), but then I've been working with computers and rendering systems for a long time now.
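For anyone curious what that kind of estimate looks like in practice, here's a minimal sketch (Python, with made-up timestamps) of how you could quantify frame pacing from per-frame presentation times. It also illustrates the point above about consistency: the jitter in frame times, not the average FPS, is what the eye tends to catch.

```python
import statistics

def frame_pacing_stats(timestamps_ms):
    """Given per-frame presentation timestamps (milliseconds),
    return the average frame time and the jitter (standard deviation).
    Large jitter is what reads as 'stutter', even at a high average FPS."""
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    avg = statistics.mean(deltas)
    jitter = statistics.stdev(deltas) if len(deltas) > 1 else 0.0
    return {
        "avg_frame_time_ms": avg,
        "avg_fps": 1000.0 / avg,
        "jitter_ms": jitter,
    }

# Made-up example: a locked 30 FPS (33.3 ms per frame) versus an uneven
# feed that averages ~40 FPS but alternates between ~17 ms and ~33 ms frames.
locked_30 = [i * 33.3 for i in range(60)]
uneven_40 = []
t = 0.0
for i in range(60):
    uneven_40.append(t)
    t += 16.7 if i % 2 == 0 else 33.3

print(frame_pacing_stats(locked_30))   # near-zero jitter
print(frame_pacing_stats(uneven_40))   # higher average FPS, but visible jitter
```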
It's more accurate to say that there is a range of "real-life frame rates", because our brains effectively do their own "real-world frame skipping": we can't perceive, let alone consciously process, visual input anywhere near as fast as it arrives. If the brain didn't drop and smooth over most of it, everything in motion would just look like a blur.
My father won a large, fancy 120 Hz TV last summer (i.e. it can display up to 120 frames per second). Wanting to see how much difference a quality TV actually makes, I played a Blu-ray of Iron Man 2, then played the same scene from a regular DVD. The Blu-ray showed a SIGNIFICANT increase in apparent frame rate and general smoothness compared to the DVD.
(For the record, theatrical films are mastered at 24 fps, and both the DVD and the Blu-ray carry that same frame rate; the extra smoothness on a 120 Hz panel most likely comes from the TV's own motion interpolation synthesizing in-between frames, with the Blu-ray's much higher resolution and cleaner source giving it more to work with.)
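For what it's worth, those in-between frames are generated by the TV itself. A toy sketch of the idea below, just linearly blending neighbouring frames with numpy; real sets do motion-compensated interpolation, so treat this as an illustration of the cadence rather than what any particular TV actually does.

```python
import numpy as np

def blend_interpolate(frame_a, frame_b, steps):
    """Synthesize `steps` display frames covering the interval from
    frame_a up to (but not including) frame_b by linear blending.
    The goal is the same as a TV's motion smoothing: stretch 24 source
    frames/s into 120 displayed frames/s."""
    return [
        ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)
        for t in np.linspace(0.0, 1.0, steps, endpoint=False)
    ]

# Made-up example: each pair of neighbouring 24 fps frames becomes
# 5 display frames, the 5x cadence a 120 Hz panel needs.
frame_a = np.zeros((720, 1280, 3), dtype=np.float32)  # stand-in for a dark frame
frame_b = np.ones((720, 1280, 3), dtype=np.float32)   # stand-in for a bright frame
display_frames = blend_interpolate(frame_a, frame_b, steps=5)
print(len(display_frames))  # 5
```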