[Update 2] How/why are console gamers satisfied with 30 fps?

Rayce Archer

New member
Jun 26, 2014
Man, I made a smartass response to this whole thread because it was pretty elitist, but now people are just spouting bad science and it's pissing me off, so here's a real answer. MAN.

So the human eye DOES capture images of the world. Technically, each eye captures part of a stereographic image of the world, and our brain puts them together into a simulation of 3D space. We capture one such image after another, continuously. As our brain pieces these together, it also interpolates them. That means much of what you see is actually an estimation between frames. It's a bit like having a 120Hz interpolating filter on your TV, only it doesn't look like shit, because our brain also has keyframes.

For those of you who don't know, keyframes are a thing in animation where a whole actor - say, a dude swinging a sword - is drawn at a critical point. These points are where an action begins, where an object changes direction, any interruption to the smooth flow of motion. When you watch interpolated TV it looks weird because every frame is in-betweened to the next with an additional frame, and this forcibly smooths even irregular actions - the real world is jerky. But your brain doesn't do this; it recognizes important movement and pieces your vision together with proper emphasis on it. That is how good your brain is.
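If you want to see what that forced smoothing looks like in the simplest possible terms, here's a rough sketch (totally made-up numbers, just to illustrate the idea): a naive interpolator averages neighbouring frames, so a sharp change of direction gets a fake "halfway" frame that never actually happened.

```python
# Toy sketch of naive frame interpolation (illustrative numbers only):
# blend every pair of captured frames and insert the result between them.

def interpolate(frames):
    """Insert a blended in-between frame between every pair of frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append((a + b) / 2)  # the invented in-between frame
    out.append(frames[-1])
    return out

# A sword swing that snaps back on the last frame (positions, arbitrary units).
captured = [0.0, 1.0, 2.0, -2.0]
print(interpolate(captured))
# [0.0, 0.5, 1.0, 1.5, 2.0, 0.0, -2.0] - note the fake 0.0 softening the snap
```

Your brain doesn't do that; it knows the snap is the important part and keeps it.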

Now the capture rate of your eyes is not fixed. It's based on saccadic eye movements, or saccades. These are tiny, involuntary movements of the eye that happen whenever your brain becomes aware of something in the environment. Did a pretty lady or handsome man walk past? Snap to that shit. Oh! A bird! LOOK AT IT. These even happen when you try to look at a totally serene image like a painting. Your eyes are constantly readjusting to get a more robust set of angles for the stereograph your brain is building you.

On every saccade, your eyes capture an image. But your rate of saccadic motion varies. If you're bored or tired, you have fewer saccades. If you're pumped up on adrenaline and terrified for your life, you have a bunch more than normal. Same deal if you're really happy or turned on or excited. That's why a married dude can, to his wife's dismay, remember what the first girl he ever kissed was wearing, but not what his wife wore the previous day. It's also why we remember tragedy so vividly it overshadows a normal "good" day - we literally take smoother, better video of it.

Here's where it gets weird.

There is no way to sync up saccades to real frame rates. Even if you could average out a person's saccades (it's about 35 per second), those saccades would not match a monitor's refresh rate, because the refresh rate is consistent while your eye might take 3 frames in 1 ms and then be lazy for the next 4 ms. For this reason a higher frame rate is always better, because it lowers your chance of seeing the same frame twice. Remember that most of what you see is fuzzed in by your brain, but your brain can also iron out the timing problems caused by your saccades. So the more unique frames of an animation you see, the better your brain can make it look when it actually builds the image. This means that in theory you can see improvement from ANY increase in frame rate - 900fps will look better than 750 UNDER SOME CIRCUMSTANCES, absolutely. It's just science.
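Don't take my word for it - here's a crude toy simulation (using the ~35/second figure from above, random sample times, all made-up assumptions) showing how often two back-to-back eye "samples" land on the very same rendered frame at different frame rates:

```python
# Toy simulation (illustrative assumptions only): the eye "samples" at random
# times averaging ~35 per second; the display shows a new frame at a fixed
# rate; count how often two consecutive samples catch the same frame.
import random

def duplicate_rate(fps, saccades_per_sec=35, seconds=1000, seed=1):
    rng = random.Random(seed)
    t, frame_indices = 0.0, []
    while t < seconds:
        t += rng.expovariate(saccades_per_sec)  # random gap between samples
        frame_indices.append(int(t * fps))      # which frame was on screen
    dupes = sum(1 for a, b in zip(frame_indices, frame_indices[1:]) if a == b)
    return dupes / (len(frame_indices) - 1)

for fps in (30, 60, 120):
    print(fps, "fps:", round(duplicate_rate(fps), 3))
# Higher fps -> fewer repeated frames seen back to back. No hard cap, ever.
```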

ON THE OTHER HAND, most of what we see IS fake. Your brain takes what's there and just guesses what happens between frames. That means that any frame rate close to or above the rate of your saccades will produce roughly comparable results. In other words, above about 40-45 fps, the same portion of what you see in a game will be invented by your brain as at 200, 300, or 9874 fps. Or reality, for that matter.

So what does this mean for gaming?

It means gaming should have two goals. First, to meet a minimum frame rate for realistic motion. Since your saccades happen on average in the mid 30s per second, that average should be exceeded for best results. So 40 and up is, again, where we hit the "smooth enough for the brain" threshold. Goal 2 is that any fps above that will make motion look better and help your brain more accurately show you what you are seeing. There is NO CAP on this. ANY AMOUNT OF FPS will help, however minutely. But it matters way less than meeting that initial value.

Let's go back to animation.

Remember when I talked about keyframes? Well, games have these too. In fact, in a modern game engine, the engine itself may do the in-betweens. And usually a keyframe is what we need to react to in a fast head-to-head game. If you are playing a shooter with hitscan bullets, then the moment a muzzle flashes is the moment a shot is out. IF A FRAME RATE IS TOO LOW to show the exact moment in an animation when an attack or other interaction happens, then you are at a disadvantage, no matter how small, against someone seeing the full event.
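Here's the back-of-envelope version of that (hypothetical numbers - say a muzzle flash that lasts 20 ms): at 30fps a frame only gets drawn every 33 ms, so the whole flash can fall between two frames and you simply never see it, while at 60fps or above at least one frame catches it.

```python
# Back-of-envelope sketch (hypothetical 20 ms muzzle flash): how many rendered
# frames actually fall inside the flash at a given frame rate?

def frames_showing_flash(fps, flash_start, flash_len=0.020, sim_time=1.0):
    """Count rendered frames whose render instant lands inside the flash."""
    frame_time = 1.0 / fps
    render_times = [k * frame_time for k in range(int(sim_time / frame_time))]
    return sum(flash_start <= t < flash_start + flash_len for t in render_times)

# A 20 ms flash starting at t = 0.040 s:
for fps in (30, 60, 120):
    print(fps, "fps:", frames_showing_flash(fps, 0.040), "frame(s) show it")
# With these particular timings the 30fps player never renders the flash at all.
```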

Again, it doesn't take FPS through the roof for this. In fact, in some games, physics and even character motion run at a lower rate than the game can potentially render. In Crysis 1, for instance, physics were calculated at up to 15fps, sometimes lower, while the frame rate of the game was totally uncapped. This means that if you were playing at 120fps (and Jesus, good luck playing Crysis at 120fps even now), a rolling barrel only checks its interaction with other objects every 8th frame.
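For anyone curious, this is just the standard fixed-timestep game loop. Here's a minimal sketch of the pattern (the numbers are illustrative assumptions, not Crysis's actual internals): physics advances on its own fixed clock while rendering runs as fast as the machine allows.

```python
# Minimal fixed-timestep loop sketch (numbers are illustrative assumptions):
# physics ticks at a fixed rate; rendering runs uncapped on top of it.

PHYSICS_HZ = 15           # assumed physics tick rate
PHYSICS_DT = 1.0 / PHYSICS_HZ

def run(render_fps, seconds=1.0):
    frame_dt = 1.0 / render_fps
    accumulator, physics_ticks, rendered, t = 0.0, 0, 0, 0.0
    while t < seconds:
        accumulator += frame_dt
        while accumulator >= PHYSICS_DT:   # the barrel only checks collisions here
            physics_ticks += 1
            accumulator -= PHYSICS_DT
        rendered += 1                      # but a frame gets drawn every pass
        t += frame_dt
    return rendered, physics_ticks

print(run(120))   # roughly (120, 15): one physics tick for every ~8 drawn frames
```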

So for those of you claiming that 30fps hurts your eyes? Well, sadly, it's mostly in your mind. Science says so. But the harder you look, the more you will notice the lower rate, because that's how eyes work. And for those of you saying that you can't notice fps above 30? Sorry, your eyes can, even if you don't think you do. And to the reaction time crowd: you're kind of right, assuming you have nerves like a steel trap anyway. But most importantly, to all the developers and publishers who think capping a game at 30fps makes it better looking? No, again, the human eye does not work that way. A higher fps never hurts; only going below that minimum needed to satisfy the brain hurts. Stop lying to customers and uncap. You aren't providing any kind of feature.
 

EXos

New member
Nov 24, 2009
Jonathan Hornsby said:
There is no point in discussing evidence with a group of people dead set on dismissing the evidence of their own life experiences, and the claims of numerous others because their experiences apparently count for nothing without some officially sanctioned scientific study.
Evidence of "life experiences" are subjective. I mean I can see the difference between 30Fps/60Fps but I wouldn't imply that everybody sees it just as I do.
Instead I go to objective studies that prove how and why.
And if you're going to assert a function of the human body because you 'think' that's how it work you're going to have a lot of disappointment in life when you grow up.

If you want to reject the lessons of real life in favor of text that's your business, I won't be wasting my time on someone so narrow-minded.
Real life, as in enjoying a nice cold drink on a hot day while lying in the sun? I don't need a piece of paper to tell me that.
Explaining what heat is and how it is generated by the sun? Yeah, I would like to see that in an official study instead of from some whiny dumb-dumb who says it's the tail of a unicorn.

Cycling through the woods and enjoying the cool wind on my face? I don't need a piece of paper to tell me I enjoy it.
Explaining how the bicycle stays upright or where the wind comes from? Yeah, I want to know it in a scientific way. Not from Jonathan Hornsby, who says the wind is that same unicorn farting.

Enjoying a game at 30/60fps? I don't need a scientific explanation for that.
How the eye sees frames per second on a screen and what the limit is? I want it on a piece of paper explaining all the wonders of the biochemical processes that make up the human eyeball.
Not from Jonathan Hornsby, who keeps changing his mind and dodging questions like an insecure nut who thinks he has all the answers to life's questions, strangely all involving unicorns... <.<
 

DoPo

"You're not cleared for that."
Jan 30, 2012
Jonathan Hornsby said:
There is no point in discussing evidence with a group of people dead set on dismissing the evidence of their own life experiences, and the claims of numerous others because their experiences apparently count for nothing without some officially sanctioned scientific study.
Says the person who dismisses other people's evidence and experiences because, without his own approval, they apparently count for nothing, along with objective scientific facts.

Jonathan Hornsby said:
If you want to reject the lessons of real life in favor of text that's your business, I won't be wasting my time on someone so narrow-minded.
Says the person who refuses to believe objective scientific facts and instead keeps claiming his own word is the only one valid.

You're the fucking gift that keeps on giving!
Jonathan Hornsby said:
DoPo said:
Jonathan Hornsby said:
I weep for a generation so ashamed of its own intelligence that it refuses to even think for itself.
Says the person who thinks being perceptive is some sort of rare superpower, and when proven wrong, claims anybody saying the opposite must be not as smart as him or they would have seen that the objective proof is not needed when you have bullshit on your side and stick to it.

Thank you for providing my entertainment tonight.
You're welcome. I never said anything to that effect, but it is clearly pointless to debate someone who has already made up their mind.
Did you or did you not compare being able to differentiate framerates above 30 with Olympic levels of dedication and effort? And with bench pressing over a ton? And outright say it takes years of training? Also, did you or did you not say that people who trust not only their opinion but actual proof lack common sense? And that you yourself possess it, because somehow not listening to what anybody else tells you proves it? I'll go ahead and answer "yes" to all of those for you, because you did.
 

Machocruz

New member
Aug 6, 2010
The "30fps is more cinematic" argument is the new "They could only do turn-based back then because of technological limitations." Both are situations where even basic knowledge of the medium and 5 seconds of critical thought show these up for the laughably erroneous claims they are.

In the context of video games, the cinematic look of a game is not dependent on framerate. Since game visuals are built from the ground up, out of 'nothing,' they can be made to look cinematic at any framerate by the programmers and artists. There is no shutter or aperture leaving its imprint on the image, as there is in film. Games have no such limitations. Game makers can manufacture the motion blur, depth of field, lighting conditions, color grading, etc. that are hallmarks of a "cinematic" look.

I also find the PC vs. console aspect of this to be strange. Yes, PC games more commonly run at higher fps, but being a melee action fan, I have played plenty of console games that run at 60. This is why I still prefer those out of Japan, as they are more likely to be designed around the highest frame rate possible during the era of release, and are faster, snappier, and twitchier than those from the West. The Western equivalents were the FPSes built on the Quake engines and others with similar movement speed.
 

waj9876

New member
Jan 14, 2012
Because then we'd have to buy new computers every couple of years.

Or I guess just build our own and upgrade that as time goes on, but then it turns from a "not having enough money" problem into a "not knowing how to do this" problem.
 

Machocruz

New member
Aug 6, 2010
Jonathan Hornsby said:
This is why The Hobbit movie is the absolute perfect example of my point, and the only evidence I need. People who were used to gaming and watching video at above 30fps didn't have any problems with the movie, while those who had rarely ever seen anything moving above 30fps (if they've ever seen it at all) were literally getting physically sick watching it. This is evidence that long term exposure has trained you to function at 60+ fps at the expense of being comfortable with 30fps or lower. That is just common sense, backed up by actual demonstrated evidence on a mass scale due to that movie. You simply reject this concept because it conflicts with your "60+ is objectively and universally better" preconceptions that are probably born of some kind of "glorious PC gaming master race" styled bias.
WARNING: This is all almost completely tangential to your point, which I didn't realize until the end. But I already spent time typing it, so...

I had plenty of problems with the look of The Hobbit, and my gaming diet consists of PC games and twitch action games - all running at a minimum of 60fps. First we need to ditch the premise that fps' influence on the look of a film applies to games. It doesn't. Films are illusion (especially FX extravaganzas like The Hobbit), and video games even more so, because their visual elements are completely manufactured. And I think James Cameron will be the one to show that a movie can maintain the 'cinematic look' at higher framerates, which would make sense given that the world of Avatar is almost completely manufactured on computers. For 'analog' films, there are the influences of lenses, DoF, shutter speed, aperture size, light angle, etc. that don't exist (natively) in video games.

But back to The Hobbit. I would like to know how 24fps alleviates the horrible-looking softcore porn lighting that was in many scenes. I've seen the films in all their forms, and the same visual issues exist. Apparently, 48fps is unflattering to daylight scenes in particular; this indicates there may be a light capture issue, or a light setup issue, and not an issue of more frames just making everything look cheap. Dark scenes and scenes with high shadow contrast do not suffer from the same late-night-Cinemax problem. Then there is the deep depth of field with which they chose to film scenes. A shallow depth of field is typical of the cinematic look, which is something you don't get with camcorders (they don't provide a lot of focus features) or news footage (it isn't going for visual style). Citizen Kane is renowned for its usage of deep DoF, but now we are dealing with a capture method that picks out a higher level of visual detail, and sets and lighting have not been adapted to this. This would not be a problem in a video game, where all these factors can be controlled by artists down to the nth detail.


Basically, people note that higher frame rates "look like video," yet they put the blame solely on those frame rates, when there are many other factors at play. Cinematography is a complex craft.

I know your point is about being conditioned to certain stimuli. When it comes to games, I agree that someone accustomed to playing at (and the key word is "playing") higher framerates would find lower framerates more troubling. I myself wouldn't even look at a character-action or fighting game if it wasn't up to the frame standard set by the best developers of such games. As it pertains to film, the headache thing seems to be real. And at this point I realize your point isn't about surface appearance, so why did I write all of this? Still, when you have actual developers making fallacious claims about how they are capping their game's framerate because it's more "cinematic," I think it's worth pointing out why this is b.s. I can point to actual technical information that counters this belief too, but I try not to parrot things that I haven't experienced or tested for myself, unless requested.
 

Skeleon

New member
Nov 2, 2007
I'd very much expect it to simply be what people are used to. Once you are accustomed to the fluidity of 60 fps or above, going back is difficult. But if it's all you know, well. I actually had a bit of a similar experience regarding resolution. I remember way back when I was adamant that 1024*768 was more than enough for me and that higher resolutions than that just make everything so annoyingly small. Now, admittedly, I was a kid back then and computers were a lot slower also, but looking back I can't really understand why I was ever opposed to higher resolutions apart from what I was used to.
 

Machocruz

New member
Aug 6, 2010
Jonathan Hornsby said:
I agree with that, believe it or not. But it still has nothing whatsoever to do with the topic of this thread. Indeed the reason for my frustration with this thread is the implied insult inherent in the title and opening post; as if being able to enjoy, or even prefer, gaming at 30fps is some kind of negative trait and cause to be passively belittled and degraded.
I simply reject the OP's premise on the grounds that A) 60fps is not foreign to consoles and B) there are PC users who are satisfied too, so there is no reason to frame this as a console issue. Not to mention that even if console gamers here weren't satisfied, so what? AAA developers and hardware manufacturers are going to do whatever fits the larger market, not people on forums. The majority of consumers are probably not as invested in this issue, or even aware of it.
 

DoPo

"You're not cleared for that."
Jan 30, 2012
Jonathan Hornsby said:
No; I didn't. I used Olympic athletes as an example because it is a universal identifier of someone who has above average ability.
And "There are men in this world who can bench-press over a ton"? That's not "above average" that's a friggin' miracle. Even if there were people who can do that, and they aren't any, how much would you think they are? 0.01% of the population? 0.000001%? Because even that would be a lot for such a feat of strength. Assuming there really were people capable of something straight out of mythos, I'd be generous and say, there'd be about 0.00000007% or about 100 people. Who would probably look like Hulk and I can only assume be able to do other mythic stuff like squeeze rocks until they produce water, lift they sky, or arm wrestle a kraken. At any rate, even if you meant the top 0.01%, that's way, way above average.

As for Olympic-level - it is not "above average" so much as literally "world class". Olympic implies the best of the best - people whose entire career is sports, and not even all of them at that, compete at the Olympic games. The cream of the crop. And to top it off, you didn't just make a comparison with Olympic-level "above averageness", which already raises the bar too damn high; your comparison included friggin' gold medalists from the Olympic games. So not only did you reference the best, but the best of the best.

Jonathan Hornsby said:
I further clarified in other posts, if you would pay attention, that it is possible to "train" oneself to do something to an above average degree through simple repetition and exposure rather than actual conscious exercise.
So when you said that the human brain is incapable of processing it, and also that it takes years of, and I quote, intense training, you were talking bullshit? Good to know you confess.

Jonathan Hornsby said:
This is why The Hobbit movie is the absolute perfect example of my point, and the only evidence I need. People who were used to gaming and watching video at above 30fps didn't have any problems with the movie, while those who had rarely ever seen anything moving above 30fps (if they've ever seen it at all) were literally getting physically sick watching it.
Aside from the fact that this did not happen the way you describe it, it only proves that it doesn't take years of training.

Jonathan Hornsby said:
has trained you to function at 60+ fps
What the fuck does this mean? I can read the sentence, I see there are words, but this part fails to parse. How does one "function" in FPS? Do I, like, think every 0.017 seconds? That makes no sense. What is this supposed to say?

Jonathan Hornsby said:
backed up by actual demonstrated evidence on a mass scale due to that movie
A movie that demonstrates that people can easily spot >30 FPS shows that people can't notice >30 FPS? Seriously, is that what you just said? Because there's been a fair amount of going back and forth from you there, so I can't understand what your current claim is. I mean, you did say people can't do it, then you said people can do it through intense exercise, and just earlier in this post you decided to turn your back on even that claim and said it's not that hard, so I wouldn't be surprised if you are now arguing something completely different. Your goalposts are all over the place.

Jonathan Hornsby said:
You simply reject this concept because it conflicts with your "60+ is objectively and universally better" preconceptions that are probably born of some kind of "glorious PC gaming master race" styled bias.
Where in Shub-Niggurath's black forest did you get that from? Care to, you know, PROVIDE EVIDENCE that I ever said that? I'll settle for a quote of me making this exact claim. I'll wait.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
Eh, while I'd prefer a million frames per second, it just isn't necessary to do a story justice.

Look, sure, 4K screens are crisper than 1080p screens. But at some point I'm still sitting 10 feet away from a 42-inch screen and just don't really care that much.

If it's a beautiful game with a well-made story, those two things simply aren't going to make a difference. It's not that they are entirely unimportant. They're just not integral. Honestly, I couldn't personally tell you if one game was in 30 FPS or another was in 60 FPS. You'd have to sit me down in front of two screens and show me the same video from the same game before I'd hazard a guess. You can tell a difference, but it's almost subconscious.

If I were a graphiophile, I would care about these things. But when I choose between my high-powered PC and my PS4, my decision comes down to the cost of the game, which controls I prefer to use for that type of game, and whether or not I want to play it in my living room. Frame rate and resolution just don't mean dick to me as long as it's at least 30 FPS and 1080p.
 

kickyourass

New member
Apr 17, 2010
Because if you place a 30 FPS scene and a 60 FPS scene side by side, I literally cannot tell the difference, so why should I care which one a game is being played at?
 

EternallyBored

Terminally Apathetic
Jun 17, 2013
Skeleon said:
I'd very much expect it to simply be what people are used to. Once you are accustomed to the fluidity of 60 fps or above, going back is difficult. But if it's all you know, well. I actually had a bit of a similar experience regarding resolution. I remember way back when I was adamant that 1024*768 was more than enough for me and that higher resolutions than that just make everything so annoyingly small. Now, admittedly, I was a kid back then and computers were a lot slower also, but looking back I can't really understand why I was ever opposed to higher resolutions apart from what I was used to.
Eh, I never had any difficulty going back and forth. The most notable examples would be switching from the PC version of an FPS to the console version at a friend's house: it's noticeable, but I usually stop noticing or caring after about 5 minutes of playing. If I were the type of high-end fighting game player who counts moves by pixels and frames, it would probably be a larger hurdle to overcome (are there even any fighting games that run at 30 FPS? I don't know much about the genre).

60 FPS is nice, but it rarely ends up on a list of features when I'm considering whether to buy a game or not, even now that I have a computer that gives me the luxury of choosing which version I want to play.
 

Machocruz

New member
Aug 6, 2010
Lightknight said:
You can tell a difference, but it's almost subconscious.
You can tell by playing. Or not, depending on what genres you're experienced in. There's a big difference between a fighting game or other intense action game at 30 and one at 60+. Games are about tactile interaction. Why people keep framing this as a graphics issue, I do not know.

Reminds me of people who say a game "looks" boring and can't understand why it has a large following. I say: maybe because it "feels" great. Seems like people spend more time looking at games than actually playing them (this is not directed at you, I'm just saying).
 

SonOfVoorhees

New member
Aug 3, 2011
The point you're missing is that PC gamers have had 60fps for years - some brag that anything less than 120fps is unplayable. At the end of the day, no one cared about fps until the 360/PS3, and I've been gaming since the Atari. It's just a way for people to continue their fanboy arguing and for PC gamers to feel included - I know they hate that the gaming argument is always about consoles and the PC isn't included. At the end of the day, enjoy the platform you're using and the games you're playing - that's all that matters. All this fanboyism just sounds like people are insecure about their console of choice and have to brag about it. When did gaming become about everything other than the games?
 

SmugFrog

Ribbit
Sep 4, 2008
Strazdas said:
there are websites that support native 60fps videos, but the name slipped my mind now sadly.
Well, call me a believer... http://30vs60.com/

Even watching that, though, if you'd only ever played at 30 fps you wouldn't know what you're missing. It's just smoother - like having a better graphics card and a better computer. What really convinced me is this one:

http://www.30vs60fps.com/

Wow. I always thought that was just a standard effect of turning really fast - but no; logically, with more frames per second you're going to see more as you turn. That would really help out in multiplayer games where your survival depends on whether or not you spot an enemy.
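Just to put rough numbers on it (made-up turn speed, purely illustrative): if you spin at 180 degrees per second, each frame at 30 fps covers a 6-degree jump, while at 60 fps it's only 3 degrees, so an enemy sitting somewhere in that sweep shows up in twice as many frames.

```python
# Rough arithmetic for the "you see more as you turn" point (made-up numbers):
# how far the view jumps per frame, and how many frames catch an enemy who sits
# inside a 90-degree slice of the turn.

TURN_SPEED = 180.0   # degrees per second, illustrative
ENEMY_ARC = 90.0     # slice of the turn during which the enemy is on screen

for fps in (30, 60, 120):
    degrees_per_frame = TURN_SPEED / fps
    frames_on_screen = ENEMY_ARC / degrees_per_frame
    print(f"{fps} fps: {degrees_per_frame:.1f} deg/frame, "
          f"enemy visible in ~{frames_on_screen:.0f} frames")
# 30 fps: 6-degree jumps, ~15 frames; 60 fps: 3 degrees, ~30; 120 fps: 1.5, ~60.
```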

I don't understand how any of you can continue to argue about 30 vs 60, as far as whether or not we can SEE it, when the evidence is right there. As for my preference? Well, obviously 60 is great, but I'd rather have a constant, stable framerate and good controls. To me, graphics really are secondary to a fun game with good gameplay.

This is interesting too - and he does a 120 FPS one if you have a 120 hz monitor. http://boallen.com/fps-compare.html
 

TKretts3

New member
Jul 20, 2010
I've played at 30 fps, I've played at 60 fps. I honestly can't tell the difference.
Then again 720p and 1080p movies don't look too different to me, so maybe I'm not the best judge.