The Big Picture: Frame Rate

Kahani

New member
May 25, 2011
927
0
0
Wax cylinders, vinyl, tape, CD... at no point has a reproduction medium failed to catch on because people were so attached to the sound of its flaws that they refused to adopt the objectively better new technology. This isn't like 3D, where the effect isn't actually the same as the one produced by our eyes; a higher framerate produces a reproduction that is closer to what we would see if we were actually there. It might look a bit odd to start with since we're not used to it (or we are, but associate it with cheap soap operas), but I'll be very surprised if 24fps is more than a niche product, like vinyl, in 10 years.
 

ewhac

Digital Spellweaver
Legacy
Escapist +
Sep 2, 2009
575
0
21
San Francisco Peninsula
Country
USA
the antithesis said:
I don't know what the fps of most HD televisions is nor if that's even a factor.
This gets goofy-bananas fairly quickly. And this only refers to North American video standards.

Take an image. Slice it up into 525 lines. Number the lines 1 through 525. Scan the odd-numbered lines out in 1/60th of a second. Then scan out the even-numbered lines in the next 1/60th of a second. You now have one frame of video, which took 1/30th of a second to display. Hence, video field rate is 60 FPS, and video frame rate is 30 FPS. This alternating display of odd then even then odd then even lines is called interlacing.
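The odd/even split described above can be sketched in a few lines of Python (a toy model that treats scanlines as list entries; real NTSC also carries blanking lines that never reach the screen):

```python
# Sketch of interlaced scan-out: split a frame's scanlines into an
# odd field and an even field, then weave them back together.

def interlace(frame):
    """Split a frame (a list of scanlines) into odd and even fields."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ... (1-based numbering)
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

def weave(odd_field, even_field):
    """Recombine two fields into one full frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

lines = [f"line {n}" for n in range(1, 481)]  # 480 visible scanlines
odd, even = interlace(lines)
assert weave(odd, even) == lines
```

Each field has half the lines, so each 1/60th-second sweep carries half the vertical resolution; that is where the blur and jitter of interlaced displays comes from.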

Now, introduce color television. It turns out you need a little extra space on every video line to synchronize the color circuitry (colorburst), but you can't speed up the horizontal sweep to compensate. Result: you're now scanning fields at 59.94 per second, and frames at 29.97 per second. So that's where those weird numbers come from.
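The arithmetic behind those numbers can be checked directly: the NTSC color standard set the horizontal line rate to 4.5 MHz / 286 (chosen so the colorburst and sound carriers wouldn't interfere), which works out to the old 60 Hz rates scaled by exactly 1000/1001. A quick sanity check in Python, using the published constants:

```python
# NTSC color timing: the 4.5 MHz sound carrier divided by 286 gives
# the horizontal line rate; 525 lines per frame gives the frame
# rate; two fields per frame gives the field rate.
line_rate = 4_500_000 / 286      # ~15,734.27 lines/s
frame_rate = line_rate / 525     # ~29.97 frames/s
field_rate = 2 * frame_rate      # ~59.94 fields/s

print(round(frame_rate, 4), round(field_rate, 4))

# The same rates, expressed as the old 30/60 Hz scaled by 1000/1001:
assert abs(frame_rate - 30 * 1000 / 1001) < 1e-9
assert abs(field_rate - 60 * 1000 / 1001) < 1e-9
```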

Meanwhile, computer displays were originally repurposed television sets, so they swept out imagery at 60 fields/30 frames per second. And it was blurry and jittery. Someone said, "Say, can't we display all 525 lines in one sweep instead of two? The machines are fast enough now..." And thus were born progressive displays, where all the lines of an image are swept out at once, rather than being broken up over a series of fields.

Then the computer guys said, "You know, 640 * 480 really isn't enough pixels. Can we get more lines? And more pixels per line?" And thus was born the multisync monitor, which would try to adapt to whatever horizontal and vertical sweep rates the computer was generating. Soon there was 800 * 600, 1024 * 768, 1152 * 864, 1280 * 1024, and beyond. And you could display them at 60 frames per second, 72 frames per second, and eventually 240 frames per second.

Then someone saw how much nicer the computer displays were looking compared to their broadcast television counterparts and said, "Can we get some of that?" And thus were born the first "hi-def" video standards. But they had to be quasi-compatible with all the very expensive equipment the studios had already paid for. Hence, "Standard Def" is the old interlaced color video standard of 59.94 fields/29.97 frames per second, called "480i" (the 'i' means "interlaced"). 480p is the same number of pixels as 480i, but they're swept out over a single 1/59.94th second frame, rather than two 1/59.94th second fields.

But the new hi-def sets were being designed by the same groups of people as had been making computer monitors all that time, and they said, "There's no reason this multisync tech can't work in a TV. If we're getting a signal that's 59.94 fields per second interlaced, we'll sync to that. If we're getting a signal that's 720p at exactly 60.00 frames per second, we'll sync to that, too. Hell, plug in your old computer; we'll display whatever that thing's kicking out. 1024 * 768 @ 72 FPS? No problem..."

So the "standard" digital video format today is, "Whatever the content creator exported it as," since all the new displays just adapt. If you're going to broadcast over the air, then you are constrained by the FCC to a limited, well-defined set of formats, which are 480i, 480p, 720p, and 1080i. (I don't know if there's an FCC-sanctioned 720i or 1080p.) OTOH, if you're just sharing H.264 files over the Internet, then it can be whatever you think the recipient's video player can handle.

And now all that is stuck in your brain, too.
 

TheSapphireKnight

I hate Dire Wolves...
Dec 4, 2008
692
0
0
I have not seen it in 48 frames, but I imagine that is where I would fall. I think it will require adjustments to the filmmaking process in terms of effects and post-production. The potential is there, and I could easily see the issues being ironed out in the future.

I think it will be something people need to get used to in terms of both watching and producing.
 

Pinkamena

Stuck in a vortex of sexy horses
Jun 27, 2011
2,371
0
0
axlryder said:
The 48 fps looks like shit to me. It's NOT the same as games, btw. I tire of hearing people say things like "oh, well, games look better at 60fps, so what's the problem?!" Games are fully graphically rendered. Many of them can afford to be silky smooth without their seams showing. Not the same for film. Not only do the effects tend to look worse, but there's a different aesthetic mentality that we view films with. 24fps seems to lend itself better to this mentality in many cases. Maybe they'll learn how to circumvent the format's problems in the future, but for now it looks like crap.

Also, "new" tech (which 48fps filming is not) doesn't necessarily mean better. I think a lot of movies would look better on film rather than being filmed digitally, despite digital being the newer tech. Hell, there's a reason why people laud Breaking Bad for being shot on 35mm as opposed to digitally. It looks good.
I had no idea that BB was filmed on actual film rolls. Interesting!
 

chozo_hybrid

What is a man? A miserable little pile of secrets.
Jul 15, 2009
3,479
14
43
leviadragon99 said:
Well for some utterly arbitrary and asinine reason I won't be able to see the movie until Boxing Day anyway, because Australian cinemas are dumb like that.
Really?

I'm in New Zealand and we already have it. How could you not? You're the closest country to us.
 

cynicalsaint1

Salvation a la Mode
Apr 1, 2010
545
0
21
I saw it in 48 FPS and I'm honestly not sure how I feel about it.

To me it still gives the film sort of a low-quality "soap opera" look that I find distracting. The fact that I already associate that look with low-quality material doesn't help.

So I feel like I can't really make a valid judgement on the tech until I get used to it enough that I don't automatically associate the 48FPS look with low quality material.
 

Griffolion

Elite Member
Aug 18, 2009
2,207
0
41
The funny thing was that the 48FPS thing didn't really affect me too much, as I've been playing PC games at 60FPS for a long time now. The smoothness of the Hobbit was simply awesome, though. I'm going to find it hard going "back" to 24FPS films now without noticing, the same way I found it hard not to notice how bad a "standard definition" picture looked compared to an HD one when I first encountered 1080p six years ago.
 

Hutzpah Chicken

New member
Mar 13, 2012
344
0
0
It's all Greek to me...
I don't understand how the speed of the movie projection has anything to do with the content.
 

ciancon

Waiting patiently.....
Nov 27, 2009
612
0
0
Ok, I have a question: is the 48 FPS film available in 2D? I'd like to see the 48 FPS one, but not in 3D. I don't want to sound finicky; it's just awkward cos I wear glasses, and wearing two pairs kind of messes up the experience.
 

PunkRex

New member
Feb 19, 2010
2,533
0
0
We were given the option to draw 24FPS during my animation course... yeah no one did... Fps dat, am I right!? THANKYOU, tip your waiters.
 

Not G. Ivingname

New member
Nov 18, 2009
6,368
0
0
I may also point out that the "24" standard is far from the standard in some media, most notably video games. There, the standard is actually 60 frames per second, and the absolute minimum before people start thinking the game is unplayable is 30. In games, movement and reactions are delayed if buttons are hit in the spaces "between" the frames. This is why some Counter-Strike tournaments are run on 1000 FPS servers. No, I have no idea how they make that work.
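The "spaces between the frames" point can be made concrete with a toy model (all numbers illustrative): if input is only sampled once per tick, a button press isn't seen until the next tick boundary, so a lower tick rate adds more latency:

```python
import math

def input_delay(press_time, tick_rate):
    """Seconds until the next tick after a button press.

    A press that lands between ticks isn't registered until the
    next sample, so the added latency shrinks as tick_rate grows."""
    tick = 1.0 / tick_rate
    next_tick = math.ceil(press_time / tick) * tick
    return next_tick - press_time

# A press 12.5 ms into the round, sampled at different tick rates:
for rate in (30, 60, 128):
    delay_ms = input_delay(0.0125, rate) * 1000
    print(f"{rate} Hz tick: press registered after {delay_ms:.2f} ms")
```

The same logic is why a game rendered and sampled at 60fps feels more responsive than one at 30fps even when both "look" smooth.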
 

Aardvaarkman

I am the one who eats ants!
Jul 14, 2011
1,262
0
0
It's kind of a stretch to even call this "new technology" - cameras have been capable of filming much higher than 48fps for decades. It's just that cinema projectors have been stuck on the 24fps "standard" for so long. Even then, it's not an issue of technology, but one of institutional momentum and laziness.
 

medv4380

The Crazy One
Feb 26, 2010
672
4
23
Interesting mention of the 48fps but it misses out on a lot of the historical details.

15fps was the old standard before 24fps, and it wasn't abandoned due to old hand cranks. It was given up because at 15fps you get odd optical effects, like wagon wheels appearing to move backwards.

Back in the day a lot of testing was done, and there is plenty of literature on it; the best frame rate to reduce those optical effects while still appearing as motion is 24fps.

The problem is that computer geeks have slipped into the motion-picture filming arena, and they are marketing 48fps.

For some reason they don't understand:
Film runs at 24 fluid frames per second.
Games, on the other hand, run at 60 still frames per second.

Your eye actually has a very slow frame rate (~15fps for color, much higher for grayscale), but it is taking fluid frames, and from them it extrapolates motion. Because game frames are still and have little or no accurate motion blur, you have to render more frames to trick the eye into seeing a blur on the retina that it can interpret as motion.

48fps was marketed on a pipe dream: it was supposed to address the portion of the population that gets motion sick from 3D. If it did that, then I might agree with the change. However, there are still reports of people getting headaches from the 3D HFR version. They assumed the headaches were caused by motion blur, and ignored anything to the contrary. The real issue is your eyes looking at an image stereoscopically in an unnatural fashion. It causes muscle strain, and if your vision is even a little bad it builds up quicker; even with good vision, if you watched something in 3D all day you'd get a headache too.

The next problem is that 48fps is an attempt to get the frames to have less blur, which makes a fast-moving scene clearer but will also look fake. The reason has nothing to do with the makeup or props. It's because your brain knows that something that is moving is supposed to have a lot of blur. Just move your fingers in front of your eyes if you don't believe me. If the brain doesn't see the blur, it knows the image is fake.

Some people say it's just like HD, and in a way they are right, but not for the reasons they think. The reason some people still say HD looks fake is foolish CG touch-up on scenes. Each time I see a scene in a film on my HD TV that looks fake, it's because the background and the foreground are both in focus. Your eyes can't focus on two different planes at the same time, and when that happens your brain will know something is fake even if it can't pinpoint the exact issue. SD had the advantage of being just grainy enough that your brain wouldn't notice the background was in focus along with the foreground, and a theater screen is so big that your eyes dart around enough that your brain doesn't realize the image is in complete focus.
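The per-frame blur claim is easy to quantify: a camera's motion streak is roughly the object's speed times the shutter-open time, so a 48fps capture at the same shutter angle records half the streak of a 24fps one. A toy calculation (the 180-degree shutter angle is the traditional film norm; the pixel speed is purely illustrative):

```python
def blur_streak(speed_px_per_s, fps, shutter_angle_deg=180):
    """Length in pixels of the motion streak captured in one frame.

    The shutter angle is the fraction of the frame interval the
    shutter stays open (180 degrees = half the frame time, the
    long-standing film convention)."""
    return speed_px_per_s * (shutter_angle_deg / 360.0) / fps

# An object crossing the screen at 2400 px/s:
print(blur_streak(2400, 24))  # 50.0 px of streak per frame at 24 fps
print(blur_streak(2400, 48))  # 25.0 px at 48 fps: half the blur
```

Whether that halved streak reads as "clearer" or "fake" is exactly the dispute in this thread.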
 

Pyrian

Hat Man
Legacy
Jul 8, 2011
1,399
8
13
San Diego, CA
Country
US
Gender
Male
In a few years, 48+FPS will look normal and 24FPS will look crappy and dated, like black-and-white: used for artistic effect and not much else.

I don't think slipping a frame past a viewer actually proves anything if the viewer can still tell you whether the framerate is higher or lower. On modern screens it's harder to notice; they're actually much better at displaying 60FPS because they don't go black between frames like CRTs did. But with CRTs, I could take a glance from across the room and tell the difference between 60FPS and 75FPS.
 

Aardvaarkman

I am the one who eats ants!
Jul 14, 2011
1,262
0
0
chozo_hybrid said:
leviadragon99 said:
Well for some utterly arbitrary and asinine reason I won't be able to see the movie until Boxing Day anyway, because Australian cinemas are dumb like that.
Really?

I'm in New Zealand and we already have it. How could you not? You're the closest country to us.
Because the movie distribution industry is even more backwards than the film-making industry?
 

chozo_hybrid

What is a man? A miserable little pile of secrets.
Jul 15, 2009
3,479
14
43
Aardvaarkman said:
chozo_hybrid said:
leviadragon99 said:
Well for some utterly arbitrary and asinine reason I won't be able to see the movie until Boxing Day anyway, because Australian cinemas are dumb like that.
Really?

I'm in New Zealand and we already have it. How could you not? You're the closest country to us.
Because the movie distribution industry is even more backwards than the film-making industry?
Damn, that sucks. I have friends in Oz looking forward to it.
 

Piorn

New member
Dec 26, 2007
1,097
0
0
Recently, I started to dislike the general picture quality of movies and to really appreciate the improvements.
I just like that I can now actually make out things that move fast across the screen, and that I don't get dizzy when the camera moves, because it doesn't get so blurry and choppy.