Gray-Philosophy said:
CrystalShadow said:
. . .
This... Is not a very reliable test in some ways.
For a fast-moving object, shown at a 'framerate' higher than our eyes can perceive, motion blur will be automatic. (For the same reason that the analog image of a real-world object will be blurred)
Even if we can't tell two adjacent frames apart, our eyes will blur them together automatically, creating motion blur where there otherwise wasn't any.
. . .
This is not entirely accurate, as far as I have learnt at least.
The average human eye is able to register about the equivalent of 24 FPS of movement. Our brain then automatically blurs out the image to adjust for any "missing frames", this is true.
However, this only applies to physical objects since light has to travel from a light source, to the object, reflect off its surface and into our eyes, where it is then processed by our brain.
Animations on a screen work differently from looking at moving physical objects though, and are not subject to the 24 "FPS" limitation and motion blur. The reason we can tell the difference in FPS in games and such is because there is no actual motion going on; the images on the screen are produced by still light being projected straight at us rather than light reflecting off a moving physical object.
Similarly, if we were to look at a moving light source we would likely be able to "track its movement" at a higher "FPS" than 24, because we're processing projected light straight from the source, and not reflected light.
Detecting differences up to and beyond the 60 FPS mark probably has a lot to do with adaptation though, explaining why some just don't see the difference and some do. It is entirely possible though, even up to and beyond 120 FPS.
I'm saying this based on the results of experimenting with the program the OP linked, more so than anything else.
It was a simplification, not an attempt to be 100% accurate with every little detail.
But let's consider a few cases based on the actual science, if you like.
We have a limit to what we can consciously perceive, which most tests have determined to be a change of 1 frame in 30.
But our eyes don't really have a 'framerate'; they operate continuously. (There are other consequences of the chemical process involved, such as exposure to bright light temporarily rendering individual receptors effectively blind, but that's not directly relevant.)
Where the 'real world' is involved, objects are in continuous motion, and giving off light the whole time.
If an object moves too fast for us to perceive clearly, since we never stop receiving light, the result is a blur over a large region.
When we look at old-fashioned film stock, this is being projected at 24 fps, but it takes physical time for the projector to advance the frame. Typically, what the projector does is blank the frame during the transition (with a physical barrier.)
This makes each frame a distinct thing with a blank (dark) period during the transition.
But since people can typically identify what's going on at such low framerates, there's no blur to it, and we may even be able to spot the 'blank' period (causing the image to flicker).
Newer film projectors play a little psychological trick here though. The film stock used is still only 24 fps, but rather than blank it on the transition, the frame is shown, blanked, then the same frame is shown again, before blanking a second time and transitioning to the next. This makes the 'blank' periods shorter, and creates an effective framerate, in some sense, of 48 fps... But the film is still only 24 fps, so you can still pick up on individual frames. What it has done is make the apparent flickering go away.
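If it helps to see the double-shuttering idea in numbers, here's a trivial sketch. (The 50/50 light/dark split per flash is just an assumption for illustration; real shutter geometry varies.)

```python
# Illustrative sketch: 24 fps film shown with a two-blade shutter.
# Each film frame is flashed twice, so the light/dark flicker happens
# at 48 Hz even though only 24 unique images are shown per second.

FILM_FPS = 24          # unique frames per second on the film stock
FLASHES_PER_FRAME = 2  # two-blade shutter: each frame is shown twice

frame_duration = 1.0 / FILM_FPS                             # ~41.7 ms per unique frame
flash_duration = frame_duration / (FLASHES_PER_FRAME * 2)   # assumes light/dark split evenly

flicker_rate = FILM_FPS * FLASHES_PER_FRAME                 # 48 light pulses per second

print(f"Unique images per second : {FILM_FPS}")
print(f"Light pulses per second  : {flicker_rate}")
print(f"Each pulse lasts roughly : {flash_duration * 1000:.1f} ms")
```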
(A film of a real-world scene will probably contain motion blur on the film frames itself, owing to how a film camera functions, but that's another issue)
Still, 24 fps is too low to be saying anything about motion blur.
But to describe the effects of computer displays at high framerates requires considering what kind of technology the display uses.
Consider old-fashioned television broadcasts, and the old style CRT televisions for a moment.
These displayed either a 50 Hz or 60 Hz interlaced signal (depending on the TV standard used).
Does this cause the eye to create motion blur?
Well, here, the display itself confuses the issue greatly.
What a CRT display does is scan an electron beam across a screen of phosphors. The phosphor glows, and this glow takes a while to decay. Only one point is actually being scanned at any one moment, and the 'framerate' is simply a measure of how often the display can draw a complete frame. At pretty much no point, however, is it NOT updating the display.
At any given moment it is almost always partway through drawing a frame, but the phosphors glow for long enough to make this imperceptible.
TVs were interlaced displays though, which means they scan one set of lines on one frame, and the alternate lines on the next. While this seems like it would leave gaps, in reality the phosphors activated by the first frame are typically still glowing somewhat during the second.
This innately blurs two adjacent frames together.
At no point is the display not giving off light, but of course, the image does transition in a way that doesn't resemble the real world. (The transition isn't instant, and happens at a single point scanning across the image)
For non-interlaced displays of course, this direct blurring of two frames doesn't happen, but phosphor decay rates still imply adjacent frames inherently blur together a bit.
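To picture how phosphor persistence blends adjacent frames, here's a toy model. The exponential decay and the time constant are assumptions purely for illustration; real phosphor decay curves are more complicated.

```python
# Toy model of an interlaced CRT: two fields (odd lines, then even lines)
# are drawn 1/60 s apart, and each phosphor's glow decays exponentially.
# By the time the second field is drawn, the first is still partly glowing,
# so what you actually see is a blend of the two.
# (Exponential decay and the time constant are assumptions for illustration.)

import math

FIELD_INTERVAL = 1 / 60       # seconds between fields (60 Hz standard)
DECAY_TIME_CONSTANT = 1 / 90  # assumed phosphor persistence, purely illustrative

def brightness(initial, elapsed):
    """Remaining glow of a phosphor `elapsed` seconds after being struck."""
    return initial * math.exp(-elapsed / DECAY_TIME_CONSTANT)

# Field 1 lights the odd lines at t = 0; field 2 lights the even lines at t = 1/60.
# Sample what the screen looks like just after field 2 is drawn:
odd_line_glow = brightness(1.0, FIELD_INTERVAL)  # leftover glow from the previous field
even_line_glow = 1.0                             # freshly drawn

print(f"Even lines (new field) : {even_line_glow:.2f}")
print(f"Odd lines (old field)  : {odd_line_glow:.2f}  <- still visibly glowing")
```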
But... Who still uses CRT displays, right?
Well, LCDs are a lot worse. 'Framerates' are an even more ambiguous concept on an LCD, because it takes time for a pixel to transition from one state to another.
On top of that, the time taken varies (but not in a straightforward, linear manner) depending on what kind of transition it is... Going from pure white to pure black takes a different amount of time to going from one intermediate shade to another.
On old LCDs this was VERY obvious, leading to what was called a 'ghosting' effect, which bears some resemblance to motion blurring, but tended to be irritating more than anything else.
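A similar toy model shows where ghosting comes from: if the pixel response time is comparable to the frame interval, the old frame is still partly visible when the new one arrives. The exponential response and the numbers here are assumptions, not real panel specs.

```python
# Toy model of LCD 'ghosting': a pixel doesn't jump to its new value, it
# drifts toward it over time. If the response time is close to the frame
# interval, the previous frame is still partly visible in the next one.
# (The exponential response and the time constants are assumptions;
# real panel response curves are messier and depend on the transition.)

import math

FRAME_INTERVAL = 1 / 60   # 60 Hz display
RESPONSE_TIME = 0.012     # assumed pixel response time, in seconds

def pixel_value(start, target, elapsed):
    """Pixel value `elapsed` seconds into a transition from start to target."""
    progress = 1 - math.exp(-elapsed / RESPONSE_TIME)
    return start + (target - start) * progress

# A pixel that was pure white (1.0) is told to go black (0.0):
value_at_next_frame = pixel_value(1.0, 0.0, FRAME_INTERVAL)
print(f"Value when the next frame arrives: {value_at_next_frame:.2f}")
# Anything noticeably above 0 means the old frame 'ghosts' into the new one.
```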
It's worth noting that LCD displays also scan the image out one pixel at a time, from top-left to bottom-right. This method of updating the display is what can create the tearing effect if a game is rendering frames faster than the display can show them. (The display will be part-way through drawing a frame, then the content of the frame buffer suddenly changes, and the rest of what's drawn is the next frame.)
There is actually no technical reason for an LCD display to update this way, it's merely a legacy of them being built on top of display systems designed for CRT monitors, but that's the way it works.
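Tearing itself is easy to sketch: scan out rows top to bottom, swap the frame buffer partway down, and the result is the top of one frame stitched to the bottom of the next. (The row count and swap point below are arbitrary; this just shows the mechanism.)

```python
# Toy illustration of tearing: the display scans out rows top to bottom,
# and if the frame buffer is swapped mid-scan, the rows above the swap point
# come from the old frame and the rows below it come from the new frame.

ROWS = 10

old_frame = [f"old row {r}" for r in range(ROWS)]
new_frame = [f"new row {r}" for r in range(ROWS)]

swap_row = 4  # the game finished a new frame while the display was on row 4

displayed = old_frame[:swap_row] + new_frame[swap_row:]

for row in displayed:
    print(row)
# Rows 0-3 show the old frame, rows 4-9 the new one: a visible 'tear'
# wherever the two frames differ.
```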
The main consequence here is that if the 'framerate' being shown is very close to the limits of the LCD (which is most of the time, because why run a display at much less than it's capable of), there will, again, be some blurring of adjacent frames due to the time it takes to transition any individual pixel.
The effect is somewhat different to a CRT display, but still has the approximate effect of blurring together adjacent frames, though the transition is typically slower, and thus smoother, than on a CRT. (Not necessarily a good thing.)
The display technology clearly influences things here, but that's not quite all.
Looking at just the eye alone, there is indeed an obvious difference between an image of a moving object on a display and an actual moving object. This is true whether you're talking about film, TV, computer games or anything else being shown on a screen. (There is a difference between recordings of reality, which introduce blurring in the camera/recording equipment, and artificially created animation, e.g. games, CGI or even hand-drawn animation, where any such blurring has to be added deliberately.)
Assuming computer generated images without motion blur, each frame will show a distinct, static image.
If a person perceives 30 frames a second, but the animation is at 60, there will be 2 frames drawn for each one a person is actually aware of. The overall effect of this is (roughly) that both frames are blurred together and seem to be one thing.
Is this different from reality? Absolutely. Reality is continuous, so there will be an image including all intermediate states between any two positions of a moving object. Functionally this is basically equivalent to having an infinite framerate.
With an artificial image, there will always be discrete, distinct positions even if you blur multiple frames together.
Whether this looks different to 'real' motion blur depends on how fast the object is moving and how large it is relative to the resolution of the display. (If an object moves only 1 pixel in a single frame, it'll make no real difference compared to seeing a real moving object.)
Anyway, it's true, there is a difference: 'motion blur' of a computer generated image running at framerates higher than the eye can perceive will look like a bunch of distinct positions blurred together, while in the real world it'd look like one continuous smear of light across the whole range of motion.
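A crude way to visualise that difference (the grid size, speed and frame counts below are arbitrary, purely illustrative numbers):

```python
# Rough comparison of the two kinds of 'motion blur' described above:
#  - real motion: the eye integrates light continuously, so a moving dot
#    leaves a smooth smear across every position it passed through;
#  - a high-framerate display without rendered motion blur: the eye blends
#    a handful of distinct positions, leaving gaps between them.

WIDTH = 40          # horizontal pixels in this toy 1D "screen"
SPEED = 30          # pixels the dot moves within one 1/30 s "perceived frame"
DISPLAY_FRAMES = 2  # display frames shown within that window (e.g. 60 fps vs 30 perceived)

def smear(positions):
    """Render a 1D strip with a '#' at every position that was lit."""
    row = [" "] * WIDTH
    for p in positions:
        row[int(p) % WIDTH] = "#"
    return "".join(row)

# Continuous motion: effectively every intermediate position is lit.
continuous = smear(range(SPEED))

# Discrete frames: only the positions actually drawn get lit.
step = SPEED / DISPLAY_FRAMES
discrete = smear(int(i * step) for i in range(DISPLAY_FRAMES))

print("real-world blur :", continuous)
print("2 blended frames:", discrete)
```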
Strictly speaking it's still motion blur either way. But of course, on the artificial image shown on a screen, it's not motion blur of the objects depicted onscreen, but rather 'motion blur' of the screen itself. (You are adding 'motion blur' to the image frames themselves, rather than the objects in the scene, which is what should be happening.)
It's different, yes, but nonetheless, the effect of exceeding the 'framerate' of the eye is to create a motion blur effect on moving objects. How convincing this looks is dependent on how fast the objects are moving, the resolution of the display, and the actual framerate of the display...
This is only perceptible as different from real-world motion blur if the object is moving too quickly. Otherwise it's near enough to the same effect to go unnoticed.
Still, all of this does show an interesting perceptual reason for why you can tell the difference looking at framerates over 30 fps even though you can't technically perceive what's going on...
Which was altogether too many words to say "you are technically correct, but within certain limits the distinction you're making doesn't matter"
...Why do I do this again? XD