The Big Picture: Frame Rate

RubyT

New member
Sep 3, 2009
372
0
0
T3hSource said:
The difference between 30 and 60 FPS and even 120 FPS is quite noticeable.
Is it?

Often games can drop from 60+ fps in unchallenging areas to around 20 fps when the heat is up. Other than the visual stuttering that occurs below 20 fps (below 30 fps for very fast movement), what difference should I notice?
 

Marendithias

New member
Sep 23, 2009
14
0
0
I saw it in 48 fps and I loved the higher frame rate. Finally, I can actually see what is going on in those fast-paced action scenes, and the overall picture quality looked absolutely beautiful. Better than any film I can remember seeing. What's more, the 3D felt so much more natural than it did in other films I have seen. I only wish all the film purists weren't so tied to the "feel" of 24 fps. It's just like the old "black and white" vs "colour" argument, or "talkies" vs "silent film". 48 fps is better in every way; we just aren't used to it yet.

I will admit that it took me about 15 to 30 min for my mind to adjust to the new frame rate during which time things seemed to move "in a strange way". Once I got used to it, it was amazing. I can't wait to see movies like Avatar 2 in 60 fps.
 

Nurb

Cynical bastard
Dec 9, 2008
3,078
0
0
Yvressian said:
Nurb said:
People don't put up with less than 30 FPS in games, so why the fuss over a higher framerate in film? You'd think removing the effect of the eye's inability to see clearly during fast movement would be a good thing.
Everyone keeps comparing the 48 frame film to the frame rates in video games, but it's apples and oranges.
A high frame rate in games produces a more natural look and motion. In movies, however, 24fps already gives perfectly natural-looking motion, and adding frames just throws it off.
The movie looks slightly like Benny Hill in the 48FPS version. Believe me, it's distracting.

Image clarity should be achieved with higher resolution photography and cameras, not fiddling with the frame rate. That way, you could get a clearer picture, but you wouldn't unintentionally mess up motion.
It is completely the same thing: frames per second as a measure of refresh rate. It doesn't matter whether it's gaming or movies; the only difference is that you and a lot of people are just used to 24fps movies. It mimics the motion blur of the eye, which is a flaw our brains compensate for when processing detail, but when we're watching a movie it doesn't need to be there at all. I like the smooth level of detail it gives, and I've noticed higher frame rates in Blu-rays on the right screen and thought it looked "different", but as an improvement. I'm the person that always disables motion blur in games too, lol; most people don't want to play a game that needs precision and have pretend eye weaknesses muddling things up.

It'll take a while but everyone will get used to it.
 

Catrixa

New member
May 21, 2011
209
0
0
JenSeven said:
Sheesh, what a bunch of total idiots.
Why not check out what we Europeans have to deal with? 29.97 FPS, with frame interlacing to cover the difference.
This basically takes two frames and puts a "mixed" one in between them, and by "mixed" I mean cut into strips like window blinds and laid semi-transparently over each other.
When paused or put into slow motion you can clearly see it, but my eyes also seem to pick it up at normal speed, and it's a completely horrible, terrible idea. It makes a movie unwatchable for me.

So before people start bitching about dumb things like 48 FPS, just try to think of worse ideas, because there are plenty of them.
I don't mean to pry (but technically I am, sorry), but where do you live in Europe? I honestly thought all European countries (and so does Wikipedia, apparently) were either PAL or SECAM, both at 25fps. And interlacing is everywhere for SD formats, as well as most 1080 you see on TV (I'm not sure anyone broadcasts in 1080p, honestly, but I live in the US and am certifiably dumb). It was originally invented for cathode ray tube TVs, but I imagine if you own a TV that doesn't use a very good deinterlacing formula (or not good enough for your eyes; everyone's different), you'd see artifacts from it (with no deinterlacing at all, you'd see teeth on the edges of anything moving, and it looks horrible).
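If it helps to picture it, here's a rough sketch of what weaving two fields together does (Python with numpy; the function names are mine, not from any real library), and why anything that moved between the two fields grows "teeth" when you pause:

```python
import numpy as np

def weave_fields(frame_a, frame_b):
    """Crude model of an interlaced frame: even scanlines from frame_a,
    odd scanlines from the later frame_b."""
    woven = frame_a.copy()
    woven[1::2, :] = frame_b[1::2, :]
    return woven

def deinterlace_linear(woven):
    """Very cheap deinterlace: rebuild the odd rows by averaging the even
    rows above and below them (hides the combing, loses vertical detail)."""
    out = woven.copy()
    above = woven[0:-2:2, :].astype(np.uint16)
    below = woven[2::2, :].astype(np.uint16)
    out[1:-1:2, :] = ((above + below) // 2).astype(np.uint8)
    return out

# Toy example: a bright square that moves 8 pixels between the two frames,
# so adjacent scanlines of the woven frame disagree about where it is.
frame_a = np.zeros((16, 32), dtype=np.uint8)
frame_b = np.zeros((16, 32), dtype=np.uint8)
frame_a[4:12, 4:12] = 255
frame_b[4:12, 12:20] = 255

woven = weave_fields(frame_a, frame_b)
print(woven[4])   # even row: square sits at columns 4..11
print(woven[5])   # odd row: square sits at columns 12..19 -> the comb "teeth"
print(deinterlace_linear(woven)[5])   # smoothed; combing mostly gone
```

A real deinterlacer is much smarter about detecting motion, but the combing it has to fight comes from exactly that row-by-row disagreement.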

Anyway...

So, what the hell. Why on EARTH didn't they do this sooner? And, really, why 48fps? I thought the whole dang reason we kept 24fps around for so long was to keep people who learned how to pan a camera at 24fps from having obsolete degrees (although this was hearsay, and I'd love to know the real reason if that wasn't it). Really, if we're going to change the damn framerate here, why can't it be 29.97 or 59.94? Why do we have to keep TV and movies on separate framerates (or, while we're at it, why not just switch to 30 and 60 fps? No one needs to worry about that silly fractional frame anymore, no need to drop it)? Really? Because here's how I see it going:

Right now:
Take a 24fps movie. Apply 3:2 pulldown (an ugly way to get more frames while keeping the movie at the same pace). Win.

After 48fps becomes a thing:
Take a 48fps movie. Remove half the frames. Apply 3:2 pulldown. Generate copious quantities of lag from having to do this math in real time. Make the poor people who have to encode this silly conversion cry.
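For anyone who hasn't had to deal with it, here's roughly what that pulldown step amounts to, as a toy Python sketch (the function name is made up, and it only counts whole fields instead of splitting each frame into top and bottom fields):

```python
def pulldown_3_2(frames):
    """Map 24fps film frames onto interlaced video fields using a 3:2 cadence:
    each group of 4 film frames is emitted as 3 + 2 + 3 + 2 = 10 fields.
    (Real pulldown alternates top/bottom fields; this toy just counts them.)"""
    fields = []
    pattern = [3, 2, 3, 2]
    for frame, repeats in zip(frames, pattern * (len(frames) // 4)):
        fields.extend([frame] * repeats)
    return fields

# 4 film frames -> 10 fields -> 5 interlaced video frames, which is how
# 24 frames/sec stretches to a nominal 30 (29.97 in practice, for NTSC reasons).
print(pulldown_3_2(["A", "B", "C", "D"]))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

And doing that backwards from a 48fps source, in real time, is where the extra pain would come in.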

Seriously, if we're going to the trouble of negating all of those 24fps degrees, can't we just consolidate framerates so they make sense? I'd suggest merging PAL and NTSC somehow, but I know that's asking too much. Ugh. At the end of the day, all of these silly video conventions date from before digital video and are dumb. But, as the old saying goes, why fix what isn't broke? Even if it makes absolutely no sense whatsoever anymore?
 

sinterklaas

New member
Dec 6, 2010
210
0
0
Saw it in 3D at 48 FPS yesterday.

I have never seen such a beautiful thing in my life before. I have no idea what people are seeing, but what I saw was a realistic looking, clear and extremely beautiful movie.

I normally go to the cinema less than once a year; that might have something to do with it. You guys are all used to the normal 'shitty' movie format, and that's why you believe it looks fake or weird.
 

DanDeFool

Elite Member
Aug 19, 2009
1,891
0
41
As far as the critic response is concerned, I figure this complaint will go away as the technique becomes more widespread. For example, if you took a 128 kbps MP3 and played it on a high-end piece of stereo equipment, you might think it doesn't sound right because you've spent so much time listening to that track on your crappy ear buds. That the higher fidelity does a lot to accentuate the flaws in the encoding doesn't make hi-fi playback bad, but it would take you some time to get used to just how different those tracks sound.

In short, you've just got to wait until you get used to it.
 

Roxor

New member
Nov 4, 2010
747
0
0
If you're going to change the frame rate, why only 48fps? I can see the logic in keeping to a multiple of 24, given that it was the past standard and a multiple makes converting old material easier, but why not make the new standard significantly higher? Say, 192fps. It would go nicely with the 192kHz sample rate for the audio.

Also, if the movie industry wants to say they're so much better than the TV or video game industry, shouldn't they put their money where their mouth is and actually give themselves a technological leg-up over their competition?

192 fps, 8192*4608 pixels, 384kHz, 26.8 channel audio. Better than TV! Better than games! Available only in theatres!
 

invadergir

New member
May 29, 2008
88
0
0
To moviebob: people aren't stupid. We get frame rate and speed.

The Hobbit got criticism because it meanders about, tries to turn a 300-page book into 3 movies, and shoehorns in its future movies by including material from outside the novel.
 

JayRPG

New member
Oct 25, 2012
585
0
0
the antithesis said:
I don't know what the fps of most HD televisions is, nor whether that's even a factor. What I can say is that the clarity of the image, especially when there's movement, is off-putting. Anyone who says this is more like real life is going to hell for lying. The sharper image of, say, a tennis match looks less real than the old non-HD image.

I think the problem may be that there is a bit of an uncanny valley effect going on. As filmmaking technology approaches what the human eye sees in real life, the differences become more glaring and off-putting. I'm not even talking about make-up, sets, and prop effects not being up to snuff in the better image. I was watching a tennis match on my parents' HD television and it didn't look like real life, and the effect of all the movement, watching the ball and such, was off-putting.

So when I eventually see The Hobbit, it will be in a non-3D 24 fps theater. I don't need to pay movie ticket prices to have a bad experience.
I just have absolutely no way to relate to that; I don't see how you find clarity during movement off-putting. It is certainly less real-looking when it comes to most sports, but I definitely prefer being able to see individual blades of grass on the field when I'm watching football. I can actually see where the footy is going, for a start... it was all guesswork on a CRT.
 

Deacon Cole

New member
Jan 10, 2009
1,365
0
0
Country
USA
Whatislove said:
I just have absolutely no way to relate to that; I don't see how you find clarity during movement off-putting, ...
The Red Letter Media guys said it looks like video, and that's an apt description, I think, if you've ever seen the picture quality of video from the early-to-mid '80s. It kind of looks like that. Maybe with some time I could find other things it looks like, but what I would not say it looks like is real life. For all that technology and expense, it's not an improvement. That's all I can really say. It would be one thing if I found it to be like real life and that was something I had to adjust to, but that isn't the case. It's not like real life. It's not even closer to real life. It looks just as fake, only in a different way. That's not an improvement.
 

dthree

Hey!
Jun 13, 2008
165
0
0
It's all about the motion blur. Everyone is used to the motion blur of 24fps, so 48 is going to be an adjustment. And to the guy complaining about 24fps films using a 3:2 cadence in 29.97: we're not talking about films shown on TV, we're talking about films in theaters. We have that same problem here in the US, and the 3:2 cadence sucks so badly that many of the better TVs detect it and reverse the cadence back to pure 24fps. The feature might be called inverse telecine, reverse telecine, or pulldown removal, depending on the product.
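As a toy illustration of what that reversal does (not any TV's actual algorithm; the hard part in practice is detecting the cadence phase from the pixel data), dropping the duplicated fields gets you back to the original film frames:

```python
def inverse_telecine(fields):
    """Undo a 3:2 cadence by collapsing runs of identical fields back into
    single film frames (toy model; assumes duplicates are exact copies)."""
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:
            frames.append(f)
    return frames

cadenced = ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']   # 3:2 pattern
print(inverse_telecine(cadenced))   # ['A', 'B', 'C', 'D'] -> back to 24fps
```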
 

Skaven252

Regular Member
Apr 7, 2010
40
0
11
There's yet another challenge to overcome: rolling shutter. Even these super-duper-expensive Red Epic digital cinema cameras have CMOS sensors that are not read out all at once but scanned top to bottom. Most of the time it doesn't seem like a problem, but it can be quite jarring at times.
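A crude way to picture it (a made-up Python/numpy sketch, nothing to do with any real camera pipeline): if each row of the frame is sampled a little later than the row above it, anything moving during the readout gets skewed.

```python
import numpy as np

def scene_at(t, height=16, width=32):
    """A toy scene: a vertical bar whose position drifts to the right over time."""
    img = np.zeros((height, width), dtype=np.uint8)
    x = int(t * 16)
    img[:, x:x + 4] = 255
    return img

def rolling_shutter_capture(scene, height=16, readout_time=1.0):
    """Build one captured frame where row y is sampled later than row y-1,
    the way a CMOS sensor is read out top to bottom."""
    rows = []
    for y in range(height):
        t = y / (height - 1) * readout_time   # lower rows are exposed later
        rows.append(scene(t)[y])
    return np.stack(rows)

captured = rolling_shutter_capture(scene_at)
print([int(np.argmax(row)) for row in captured])   # bar's left edge shifts row by row
```

The bar comes out leaning, which is the same skew you see on fast pans; faster readout or global-shutter sensors are the usual fixes.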