DoPo said:
CrystalShadow said:
When we look at old-fashioned film stock, it's projected at 24 fps, but it takes physical time for the projector to advance the frame. Typically, the projector blanks the frame during the transition (with a physical barrier).
This makes each frame a distinct thing with a blank (dark) period during the transition.
But since people can typically identify what's going on at such low framerates, there's no blur between frames, and we may even be able to spot the 'blank' period (causing the image to flicker).
I'm not sure what you mean there - movies do have motion blur. It's... well, "built into" them, I suppose, but each frame is not just a still photo of exactly one moment in time. When filmed at 24 FPS, you get 24 frames that together encompass a whole second; in other words, each frame covers 1/24th of a second (about 40 milliseconds) - all of what happened in that time. Therefore, if an object is moving, its position throughout the whole exposure is captured, producing motion blur. Everything in movies has motion blur because of that. So when you are shown the 24 frames, the transition between them looks much smoother to the human eye because it's closer to what real life looks like, i.e., not a collection of still images but continuous motion.
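To put rough numbers on the above (assuming, for illustration only, that the shutter stays open for the entire frame interval, which real cameras usually don't - more on that below):

```python
# Frame interval at 24 fps: how much time one frame "covers".
fps = 24
frame_interval_ms = 1000 / fps  # 1/24 s, about 41.7 ms

# Hypothetical object moving at 2.4 m/s: if the exposure lasted the
# whole frame interval, the blur streak would span this much travel.
speed_m_per_s = 2.4
blur_m = speed_m_per_s * (frame_interval_ms / 1000)

print(round(frame_interval_ms, 1))  # ~41.7 ms per frame
print(round(blur_m, 3))             # ~0.1 m of motion captured in one frame
```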
Yeah, that's not what I was referring to. I tried to explain that later on, but I'll restate it:
Motion blur on the film stock is a result of blurring captured by the camera (if CGI images were transferred to film stock, there might not be any motion blur on the frames). However, at 24 frames per second, each frame is perceived by the eye as an individual thing.
At, say, 48 frames per second, this isn't the case: you lose the identifiable transition between any two frames.
This is much the same as motion blur, but happens at a different point in the overall process.
You're confusing what the camera captured with what the eye does with the footage when it's played back. It's the distinction between the objects captured on film being blurred in the footage itself, and the eye blurring together the content of two or more frames because the framerate is higher than what the eye can perceive clearly.
(At 24 fps, not only does the eye not appreciably blur together adjacent frames, but the transition period during which the projector shutter is closed may be long enough to appear as a dark frame, or at least, flickering in the image.)
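(Incidentally, this is why real projectors typically flash each frame more than once: a two- or three-blade shutter raises the flicker rate well above the frame rate, past the eye's rough flicker-fusion threshold. A quick sketch of the numbers - the ~50-60 Hz threshold is an approximate figure, not an exact constant:)

```python
fps = 24
for blades in (1, 2, 3):
    # Each frame is flashed `blades` times per frame interval.
    flicker_hz = fps * blades
    print(blades, flicker_hz)  # 1 blade -> 24 Hz (visible flicker); 2 -> 48; 3 -> 72
```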
Anyway, both of these effects are forms of motion blur, but they have different causes.
Playing back footage at 60 fps would cause the eye to blur together multiple frames.
The blur recorded on any particular frame, on the other hand, depends on the camera: its shutter speed, the recording framerate, and many other factors, all of which can lead to more or less blurring than you would expect if you were witnessing the actual events the camera was recording. (Recorded footage makes the motion blur a function of the camera used, while actually being there makes it a function of the limits of the human eye.)
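One way to quantify that last point: film cameras often express exposure as a shutter angle, where the exposure time is (angle / 360) of the frame interval. A minimal sketch (the 180-degree value below is a common convention, not a universal one):

```python
def exposure_time_s(fps: float, shutter_angle_deg: float) -> float:
    """Per-frame exposure: the fraction of the frame interval the shutter is open."""
    return (shutter_angle_deg / 360.0) / fps

# At 24 fps with a 180-degree shutter, each frame is exposed for 1/48 s,
# so frames carry less blur than the full 1/24 s interval would suggest.
print(exposure_time_s(24, 180))  # 1/48 s, about 0.0208 s
print(exposure_time_s(24, 90))   # narrower shutter angle -> crisper frames
```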
Anyway, not sure if that's any clearer, but I don't know how else to explain it really...