[Update 2] How/why are console gamers satisfied with 30 fps?

RandV80

New member
Oct 1, 2009
Gundam GP01 said:
Sir Thomas Sean Connery said:
Gundam GP01 said:
small said:
Gundam GP01 said:
erttheking said:
Gundam GP01 said:
erttheking said:
Gundam GP01 said:
erttheking said:
Because I quite simply don't care. Graphics come in dead last when it comes to making a game enjoyable. FPS falls into that category for me. I don't care what the FPS is on a game so long as it's fun to play.
What, and virtually doubling your potential reaction time doesn't do that?
For games that rely on fast reflexes, like twitch shooters, DMC style brawlers, fighting games, and high speed racing games, a higher framerate is virtually mandatory.

I imagine that attempting to play Metal Gear Rising at only 30 FPS would be a lot more frustrating than it would be at a smooth 60.
*Shrugs* Revengeance played just fine on my 360. Don't really see what the problem is.
That's because it was running at 60 FPS on consoles as well as PC. Imagine playing it with half of the frames cut out.
Or, if you have a PS4, grab the remaster of The Last of Us, play for a while without changing anything, then go into the options and lock the framerate to 30 fps. But if you can't do that either, play a level of MGR and the DmC reboot back to back, notice how your reaction time changes, and see if you can tell which one feels smoother.
How is it so hard to believe that it doesn't matter to some people?

I could see a minor difference in the videos in the links, but it wasn't enough for me to care. Both still looked perfectly good enough to play.
I think the fact that you had to spell out for me that it was 60 fps and not 30 fps should tell you how little impact fps has on me.
Try this then.
http://30vs60.com/formula1.php
Or this
http://www.30vs60fps.com/
Or this
http://boallen.com/fps-compare.html

It should be easy to see how framerate affects gameplay with these sources.
Thank you for posting those links, I was curious to see the difference myself and... apart from the bottom link's example spinning faster at 15fps, I couldn't actually see a difference in any of them.
Do all of you people have 30Hz monitors or something?
How is it so hard to believe that it doesn't matter to some people?

I saw a difference in those links, sure, but it wasn't enough to care about. Both the 30 and the 60 videos looked perfectly good enough to play.
It isn't that it doesn't matter to them, it's that they're saying they can't see any difference when that difference is so obvious to me.
I'm a PC gamer and I didn't really notice a difference in the Formula 1 video. You know what I think the problem is here? There isn't a term for it, but we're looking at the visual equivalent of an audiophile.

Forget expensive PC rigs, there are people who will spend thousands of dollars on stereo equipment to get that perfect sound. MP3s are good enough for the vast majority of us, but to these people they are blasphemous: if it's digital they'll only deal with the FLAC format, or they keep vinyl collections, or whatever. But like I said, for most of us an MP3 is 'good enough'.

Personally, when it comes to audio I don't really notice a difference between my TV's default mono and having the stereo system turned on with surround sound going, and I get no extra enjoyment from the top-of-the-line sound at a big movie theater. Something I did absolutely love, though? The 3D-enhanced visual experience of Avatar, which people on the internet love to hate, complaining endlessly about how useless 3D is and, ever since Avatar came out, about how it's just a fad that's going to disappear. So obviously I'm completely psyched for the Oculus Rift to hit the shelves; now there's something I'll sink money into.

Basically, we all have our personal tastes & preferences, as well as varying abilities to perceive input to the various senses.
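As a footnote on the reaction-time point quoted above, the frame-interval arithmetic is simple. Here's a minimal sketch (Python, illustrative numbers only), with the caveat that the gain is in how fresh the on-screen information is, not in human reaction speed itself:

```python
# Frame-interval arithmetic (illustrative only): time between displayed frames.
for fps in (30, 60, 120):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_ms:5.1f} ms per frame")

# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms: at 60 fps the newest on-screen
# information is at most half as stale, which is the "reaction time" point.
```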
 

Elijin

Elite Muppet
Legacy
Feb 15, 2009
Prime_Hunter_H01 said:
FPS sensitivity is a learned skill, and really it depends on the person. But hang around the right people and it seems common, and you only see the "others" that don't care.

I like to make a car comparison: a car guy is like a heavy PC gamer, because of their interests they have trained themselves to see and feel every small change in performance and it really affects them, like the smoothness of the ride or the sound of a good engine, while a normal person gets what they want and will only see/feel an extreme change. And it's the same with FPS: 30 is smooth to everyone who hasn't trained themselves to see it because of an interest in it.

So the sentiment stands: if they don't care, they don't see it.

The difference is only major to those who care, and who can even see it.

As for me, I barely see a difference; it took a minute of really staring at the 30vs60 comparison website before I saw a minuscule difference.
This analogy is pretty on point.

High-end PC gamers notice because they chase it. If you've never chased it, you'd find it isn't anywhere near as big of a deal.
 

Gray-Philosophy

New member
Sep 19, 2014
JohnnyDelRay said:
bobleponge said:
Movies have been 24 fps for forever and that's been good enough for me. Art direction and storytelling are way more important (also lighting. It's nice to be able to see things, developers).
Yes, this is what I've been wondering too. Aren't all movies at 24fps? Or is this only from the NTSC/PAL days? Now it depends on the refresh rate of the TV, doesn't it?

-snippity-
It has to do with the way that the human eye and camera lenses register light and motion, i.e. FPS.
The average human eye is able to register about the equivalent of 24 FPS of movement. Our brain then automatically smooths out the image to adjust for any "missing frames". This is why we experience motion blur when things move faster than what we can register at 24 FPS. Try frantically waving your hand back and forth in front of your face: most of you should see blurred fingers rather than a super-smooth, easily distinguishable motion.

The average camera works very similarly: it records 24 still frames every second, and anything moving too fast becomes a blur. Now of course there are cameras capable of recording images at a much higher framerate to get really smooth super-slow-motion shots, but the motion recorded by the average camera is still adjusted, and motion blur is interpreted by the brain as smooth motion, making FPS less significant in recorded films.

3D animation is not a recorded image though, and is therefore not subject to the 24 FPS limitation and motion blur. The reason we can tell the difference in FPS in games is that there is no actual motion going on: the images on the screen are produced by light being projected straight at our eyes rather than light reflecting off a moving physical object.

Detecting differences up to and beyond the 60 FPS mark probably has a lot to do with adaptation though, explaining why some just don't see the difference and some do.
Pigeons are able to register the equivalent of about 224 frames per second, causing them to experience their visual perception in a state of slow motion compared to humans. This is also why they don't usually move until the last moment; in their minds they still have plenty of time to get away. They would probably also be bored out of their minds if they watched a 24 fps movie.

As for me, there's no denying that I'm definitely more pleased with a higher framerate and resolution, but it's not worth much if the game is still a shitty game. I also have no problem coping with low FPS, but too low does become a bother eventually.
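One way to put numbers on the camera point above is shutter-exposure arithmetic: film cameras commonly use a 180-degree shutter, so each 24 fps frame accumulates real motion as blur. A rough sketch, with a made-up on-screen speed purely for illustration:

```python
# Rough sketch: why filmed 24 fps can read smoother than rendered 24 fps.
# Assumes the common 180-degree film shutter; the speed is a made-up example.
fps = 24
shutter_angle = 180                        # half of each frame interval is exposed
exposure_s = (shutter_angle / 360) / fps   # ~0.0208 s of light per frame

speed_px_per_s = 1200                      # hypothetical on-screen motion speed
blur_px = speed_px_per_s * exposure_s      # ~25 px of smear baked into the frame
print(f"exposure {exposure_s * 1000:.1f} ms, motion blur {blur_px:.0f} px")

# A game frame is effectively an instantaneous snapshot (zero smear unless the
# engine adds motion blur), so the same 24 fps looks more stuttery in a game.
```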
 
Dec 16, 2009
Well, I'm not really a console gamer anymore (not sure if the Wii U counts, nor what its average frame rate is), but I'd say it's down to not knowing any better.

I don't mean that in a dismissive sense; if you've never seen how much smoother 60fps is while gaming, it's not going to affect you, as you don't know what you're missing.

That's my take, anyway.
 

jklinders

New member
Sep 21, 2010
CrystalShadow said:
Because it's always a trade-off.

I'm one of those people that does both, but because PCs are so flexible, I often experiment with settings.

What framerate I will put up with really depends on the game.

Some games are really, really bad at low framerates.
Even console games. (One that comes to mind is Bit.Trip Beat: even on the Wii, where the only 'framerate' choices are PAL 50Hz vs 60Hz mode, 60Hz is better for gameplay.)

Thing is, in a lot of games 60 fps just doesn't get you anything.

And the price you pay is reduced graphical quality.

Less detailed meshes, lower resolution textures (lower resolutions generally), lower quality effects... Etc.

As an example, on a mid-range PC playing, say, the original Crysis, going from 30 fps to 60 is the difference between playing on one of the highest graphical settings versus one of the lowest.

The improvements in framerate come at such a huge cost to visual quality, it's clearly only something you'd do if you were absolutely desperate, or if it was having a very large, and very obvious negative effect on being able to play the game...

I, for one, don't like making that trade-off, so I often opt for lower framerates but higher visual quality.
(Except for AA. AA can suck it, mostly. I mean, yes, it usually makes things look better, but typically at the expense of AT LEAST a 15 fps drop in my experience, and often a lot more. Besides, if you have the choice between AA and simply running the game at a higher resolution, the higher resolution wins every time. Why play a game at 1280x720 with 4x AA when you can play it at 2560x1600 with AA turned off?)
This encapsulates my view on the matter perfectly. The trade-off of graphics vs FPS is simply not worth it for me. I don't do much twitch gaming, so that has something to do with it, but as long as the frames are moving smoothly in real time, and 30-35 fps does that just fine, I'm OK with it.
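The resolution-vs-AA preference in the quoted post comes down to pixel-count arithmetic. A quick sketch (rough, since real GPU cost isn't perfectly linear in pixels shaded):

```python
# Pixel-count arithmetic behind the AA-vs-resolution preference quoted above.
low = 1280 * 720      #   921,600 pixels
high = 2560 * 1600    # 4,096,000 pixels
print(f"2560x1600 is {high / low:.1f}x the pixels of 1280x720")  # ~4.4x

# 4x supersampling-style AA at 1280x720 shades a comparable number of samples
# (~3.7M) to native 2560x1600, which is roughly why raw resolution wins here.
```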
 

zumbledum

New member
Nov 13, 2011
Aaron Sylvester said:
Over the last 15+ years most consoles have been aiming for the 30fps mark, and I don't see that trend changing for the next decade to come. Ubisoft just announced that AC Unity will run at 900p and be locked to 30fps on PS4 and XBone for "platform equality", which was hilarious. Even better was their attempt to justify/explain that, but nothing can answer the question of why current-gen consoles (approximately 3-5x more powerful) are running at the same framerate and barely higher resolution than last-gen.

But in any case, it would be foolish to blame Ubisoft (or any developer) for making such a decision; after all, they are businesses first. Console gamers have clearly spoken with their wallets and reviews/feedback: "we don't care if the game runs at 30fps or 60fps...just make sure it looks nice".

Of course devs/publishers are more than happy to do exactly that. In gameplay trailers, video reviews, etc., framerate is not a factor and can go completely ignored; it's all about visuals and resolution. Even in reviews, the framerate of console games is rarely criticized, praised or even mentioned unless it keeps dipping down to 20fps (lol?). On PC, a game running at 30fps vs 60fps is worlds apart and can make (or break) the experience for me, and let's not even get started on the awesomeness of 120-144Hz. I know for a fact that the PC crowd is quick to CRUCIFY any game that dares lock itself to 30fps; we've seen that happen time and again.

So my question to the console gamers of the Escapist is this - how are you happy with playing games at 30fps? Is there really no difference playing at 60, or is that difference really so negligible that you would prefer better resolution/visuals?

The whole FPS argument is so badly understood and argued, you just can't compare. 30 fps on a console, on a large TV screen that you're normally sitting a lot further away from, playing with a less precise controller, is going to be far more watchable than the same 30 fps on a high-res monitor viewed up close.

All sorts of things, from brightness to motion, affect how we see things.
This link is to a page that must be about 20 years old now; I first found it in my Quake 2 days, and it's by far the best explanation of the issue I've ever seen.

http://www.100fps.com/how_many_frames_can_humans_see.htm
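A rough way to put numbers on the viewing-distance point above is the angular size of the screen, which is what actually hits the eye. A hedged sketch with typical screen dimensions (the sizes and distances are assumptions for illustration):

```python
import math

# Angular width of a screen seen from a given distance (all lengths in metres).
def visual_angle_deg(width_m, distance_m):
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

print(f"55-inch TV at 3 m:       {visual_angle_deg(1.21, 3.0):.0f} degrees")   # ~23
print(f"24-inch monitor at 0.6 m: {visual_angle_deg(0.53, 0.6):.0f} degrees")  # ~48

# The monitor up close can fill roughly twice the angle of your vision, so the
# same 30 fps judder covers far more of your field of view and is easier to see.
```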
 

M0rp43vs

Most Refined Escapist
Jul 4, 2008
I'm gonna be a bit nicer in this thread.

60 FPS is LIKE crystal-clear, extremely real-life-looking graphics: nice to have, and I won't be pissy while experiencing it, but eventually I won't notice it. I won't complain as long as a game at least has a passable version of it, and I tend to enjoy interesting gameplay more anyway.

Sorry dude, but for most people, it's a novelty.
 

RicoADF

Welcome back Commander
Jun 2, 2009
Aaron Sylvester said:
Because most TVs' frame rates are set at 30fps, so having the game render above that is pointless since the TV wouldn't display the extra frames (or 25fps for us in PAL regions; NTSC was the 29.97 fps rate). A lot of modern HD TVs can do higher, but movies/game consoles etc. still follow the 30fps cap to keep to the standard.

Also, as someone in the PC master race, I find the fps whinging ridiculous; anything above 30fps is all you need. People who complain about sub-60 or sub-120fps are just trying to stroke their egos and focus far too much on the numbers. A consistent frame rate matters more: you notice stutter if the frame rate varies, but if it stays consistent then it looks smooth regardless of the numbers.

Jonathan Hornsby said:
The reason for this is actually very simple, but something nearly all the diehard PC gamers here will instantly protest: you guys can only see a difference because you've TRAINED yourselves to see it. You've all spent so long tweaking settings and working your butts off (to say nothing of paying small fortunes) to squeeze the absolute most out of your rigs that you have taught yourselves to see and criticize every little minute detail to a degree that the average viewer, or I guess you could say casual viewer, just can't see. Want another example? I can't tell a difference between 720p and 1080p. And I have 20/10 vision. You guys can only see these differences because you've spent so long trying that you've found a way to make your eyes more sensitive to these details. Good for you, an impressive feat to be sure. But that doesn't mean something running at 720p and 30fps looks like crap to everyone; that's just you and your unnaturally acute super-eyes that you've spent way too long cultivating. Now get up and go outside; staring at a screen that intently for that long can't be good for you.
You've pretty much hit the nail on the head here, mate. For most people, as long as it's above 30fps and stays consistent then it's perfect; you can only see the difference if you focus on it and want to see the difference, because you've gotten used to picking it up. It's like how someone who drives a sports car can't stand a small VW: yeah, the sports car is 'superior', but for the everyday Joe Blow the VW is more than adequate.
 

Darkong

New member
Nov 6, 2007
RicoADF said:
Because most TVs' frame rates are set at 30fps, so having the game render above that is pointless since the TV wouldn't display the extra frames (or 25fps for us in PAL regions; NTSC was the 29.97 fps rate). A lot of modern HD TVs can do higher, but movies/game consoles etc. still follow the 30fps cap to keep to the standard.

Also, as someone in the PC master race, I find the fps whinging ridiculous; anything above 30fps is all you need. People who complain about sub-60 or sub-120fps are just trying to stroke their egos and focus far too much on the numbers. A consistent frame rate matters more: you notice stutter if the frame rate varies, but if it stays consistent then it looks smooth regardless of the numbers.
TVs set for 30 fps? Christ man, how old is your TV then? The standard refresh rate for every LCD TV is 60Hz; you're confusing the frame rate of film with the refresh rate of a screen. It's nothing about keeping to a standard and everything about ease of optimising: 30 is seen as an acceptable bare minimum, nothing more.

I don't understand why people are so eager to swallow the excuses from publishers about restricted frame rates. Frankly, if Nintendo can get 60fps from the Wii U, there is no good reason why the two more powerful consoles can't at least do the same. What's the point of a new console generation with more powerful machines if they're barely producing better than what the previous generation was capable of?
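As a side note on the 29.97 figure quoted above: it's the legacy NTSC broadcast rate, not a property of modern screens. The derivation is one line:

```python
# The 29.97 fps figure is the legacy NTSC color-broadcast rate: the nominal
# 30 fps was slowed by a factor of 1000/1001 to avoid carrier interference.
ntsc_fps = 30 * 1000 / 1001
print(f"NTSC: {ntsc_fps:.3f} fps")  # 29.970

# Modern LCD TVs refresh at 60 Hz or more, so a console can show 60 distinct
# frames per second; a 30 fps cap is a rendering-budget choice, not a TV limit.
```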
 

Trivun

Stabat mater dolorosa
Dec 13, 2008
erttheking said:
Because I quite simply don't care. Graphics come in dead last when it comes to making a game enjoyable. FPS falls into that category for me. I don't care what the FPS is on a game so long as it's fun to play.
Massively ninja'd. I enjoy playing the older Pokémon games as much as I do the newer ones. I love Fallout 3 despite it having lower-capability graphics than New Vegas. I'll happily play indie games on Kongregate without them needing to match Halo 4's frame rate. Hell, I'm putting off buying a PS4 until the price drops, so until it does I'll gladly buy any games getting both PS3 and PS4 releases on the older console, for the sake of playing a good game. FPS means absolutely nothing to me...
 

Savagezion

New member
Mar 28, 2010
Darkong said:
RicoADF said:
Because most TVs' frame rates are set at 30fps, so having the game render above that is pointless since the TV wouldn't display the extra frames (or 25fps for us in PAL regions; NTSC was the 29.97 fps rate). A lot of modern HD TVs can do higher, but movies/game consoles etc. still follow the 30fps cap to keep to the standard.

Also, as someone in the PC master race, I find the fps whinging ridiculous; anything above 30fps is all you need. People who complain about sub-60 or sub-120fps are just trying to stroke their egos and focus far too much on the numbers. A consistent frame rate matters more: you notice stutter if the frame rate varies, but if it stays consistent then it looks smooth regardless of the numbers.
TVs set for 30 fps? Christ man, how old is your TV then? The standard refresh rate for every LCD TV is 60Hz; you're confusing the frame rate of film with the refresh rate of a screen. It's nothing about keeping to a standard and everything about ease of optimising: 30 is seen as an acceptable bare minimum, nothing more.

I don't understand why people are so eager to swallow the excuses from publishers about restricted frame rates. Frankly, if Nintendo can get 60fps from the Wii U, there is no good reason why the two more powerful consoles can't at least do the same. What's the point of a new console generation with more powerful machines if they're barely producing better than what the previous generation was capable of?
Because the truth is, graphics do matter to more people than will admit it. They actually will let graphics hinder the smoothness of the gameplay. I find it hard to accept that some people can't notice the difference between 60 FPS and 30. Usually when you yell "I pushed the button!", say hello to 30 FPS. I would be willing to bet a lot of the people saying "I don't care" have also yelled "I pushed the button!" out of frustration a lot.

I really like this post for pointing out the hypocrisy:

RandV80 said:
The reasoning is perfectly understandable; as a PC gamer my rig is never cutting-edge and I've been playing games at lower settings for 15 years without much concern. But at the same time there's a whole lot of hypocrisy here. The OP calls them 'console gamers' but realistically he's talking about PS/Xbox fans, and when it comes to Nintendo fans they'll lord their graphics & hardware superiority over them. Then PC gamers poke their noses into the conversation, 'hey, we're actually better than you there', and suddenly these things don't matter anymore.

For the current gen the defensive argument usually comes from Xbox owners/fans, since they're on the short end of the stick here with PS fans as the aggressors, but you just know that if the One could do 1080p/60fps while the PS4 was lagging behind with some combination of 720p/30fps, the positions would seamlessly flip.
I am a PC/PS gamer mostly. I have played my share of 30 fps games and 60 fps games; 60 is noticeably different. I am not a graphics whore, but I do like the options photorealism offers, just like I like what anime-style and cartoon-style graphics offer. For me, it is all about what you do with your engine. Trust me, the "good enough" bar is pretty low. For some games I accept 30 FPS because there is no 60 FPS option available but the game is good enough to make me want to play it despite that. However, if 60 FPS were available, I would pick it every time on purpose. I wouldn't ever choose the 30 FPS, and I do wish 60 FPS were the standard. I have seen people want 90 FPS to be the standard, which is insane to me, as I don't think my reflexes are that fast anyway.
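The "I pushed the button!" effect mentioned above can be put in rough numbers. A hedged sketch assuming a hypothetical three-frame pipeline from input sampling to scan-out (real engines vary in depth):

```python
# Rough worst-case input latency for an assumed 3-frame pipeline
# (input sample -> simulation -> render -> scan-out); purely illustrative.
PIPELINE_FRAMES = 3

for fps in (30, 60):
    frame_ms = 1000.0 / fps
    print(f"{fps} fps: up to ~{PIPELINE_FRAMES * frame_ms:.0f} ms press-to-photons")

# ~100 ms at 30 fps vs ~50 ms at 60 fps under these assumptions, which is
# the gap behind "I pushed the button!" moments.
```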
 

DarkhoIlow

New member
Dec 31, 2009
I cannot play games at 30fps... I just can't... I notice it immediately and notice the slowdown compared to 60fps (the most recent being Dead Rising 3: at 30fps it was unplayable for me until I unlocked the framerate).

60fps is objectively better than 30, but if people don't care then that's their prerogative. It bothers me a lot that they choose to play their games in slow motion, but ehh, whatever. I guess once they've gotten used to 30fps from playing on consoles since forever, anything that is different from that (the random game that runs at 60fps) seems like an anomaly or "too fast".

It boggles my mind and will continue to do so until the point where they get consoles powerful enough to do 60fps on all their games, and then accept the fact that from a gameplay standpoint it is and always will be better.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
elvor0 said:
For some stuff it's like putting someone's glasses on when you're not supposed to wear glasses, but worse. Seriously, play an Xbox 360 game in non-HD, then play it in HD. Washed-out colours, very blurry viewing, mini-maps are unreadable, so is text, pin-point accuracy is out the window. I have to really strain my eyes to see what I'm doing if I have to play a game on a non-HD TV, which leads to my eyes hurting.
Aside from the text issue, not really. It may look better, but in going beyond that you've reached into levels of hyperbole usually reserved for reporting about the President's secret Muslim strategy to forcibly gay marry everyone.

Smooth Operator said:
I mentioned that in the final part of my post: people who don't want or care to notice the difference between red and blue just won't. For those who do care to notice, however, it becomes impossible not to see it.
That sort of looks like it rendered your list of other excuses completely meaningless.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
RedDeadFred said:
SARCASM ALERT!
It's also possible that I am just superior to you and am therefore able to subconsciously adjust my eyes for any demand that video games bring to them. Mortals are so sad.
SARCASM OVER!
*rolls 1 on Sarcasm Save*

oh, great and glorious one! How shall we worship you!

...Sorry, couldn't resist.
spwatkins said:
I think that the issue comes down to the whole 720p vs 900p vs 1080p debate. I guess that the game makers that evaluate the tradeoffs get much more grief about not being 1080p than they do about the framerate being 30 vs. 60. Maybe a lot of the people weighing in negatively here about the framerate would just switch to complaining about the lines of resolution if all the games ran at 60 FPS.
Frame rate isn't the buzzword that has been sold to the public, though. If not for TVs, 1080p probably wouldn't matter to the public at all. But they advertised all the p's they had, and people know what that is.

If, perhaps, we see an increase in higher-framerate movies, we might see people care about FPS. But as with the number of p's, they probably won't know/care why.
 

remnant_phoenix

New member
Apr 4, 2011
Because CONTENT is what makes a game good.

To me, caring about 30fps vs 60fps (or whatever else) is like caring about whether someone hand-wrapped my birthday gift, or had it gift-wrapped with cool-looking ribbon. The box/wrapping/etc is merely the method of delivery for what really matters.

Does a fancily gift-wrapped present look better than a hand-wrapped one? As a general rule, yes. Does it improve the overall experience of receiving the gift? Sure. But it's the content that's going to stay with you, so that's what really matters in the end.
 

ILikeEggs

New member
Mar 30, 2011
120 FPS master-race reporting for duty.

Now if only I could get my hands on a cheap 120hz monitor.
 

DOOM GUY

Welcome to the Fantasy Zone
Jul 3, 2010
60fps does feel a lot better to me, it's just smoother and overall a better experience. However, when a console game is 30fps, it's a bit better than a PC game running at only 30fps, because on the console it's actually optimized to run at that frame rate.

Still, I'd take 60fps over 30fps any day.
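The "optimized for 30" point is mostly about frame pacing. A hedged sketch of the difference between a locked 30 fps and the same average delivered unevenly (illustrative numbers only):

```python
# Two ways to average ~30 fps over six frames (times in ms, illustrative).
locked   = [33.3] * 6                              # even delivery: smooth
unstable = [16.7, 50.0, 16.7, 50.0, 16.7, 50.0]    # same average, uneven: stutter

for name, times in (("locked 30 fps", locked), ("unstable ~30 fps", unstable)):
    avg_fps = 1000 * len(times) / sum(times)
    print(f"{name}: ~{avg_fps:.0f} fps average, frame times {times}")

# A console title can lock to an even 33.3 ms cadence, while an unoptimized
# port may hit the same average with uneven frame times, which reads as stutter.
```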
 

Morgoth780

New member
Aug 6, 2014
porunga said:
DarkhoIlow said:
60fps is objectively better than 30, but if people don't care then that's their prerogative.
No it isn't. It's objectively FASTER. It's subjectively better.
In what way would 30 fps be subjectively better than 60? Perhaps it would allow for higher fidelity graphics if you have fixed hardware specs, but that's all I can think of.