[Update 2] How/why are console gamers satisfied with 30 fps?

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
CrystalShadow said:
When we look at old-fashioned film stock, this is being projected at 24 fps, but it takes physical time for the projector to advance the frame. Typically, what the projector does is blank the frame during the transition (with a physical barrier.)
This makes each frame a distinct thing with a blank (dark) period during the transition.
But since people can typically identify what's going on at such low framerates, there's no blur to it, and we may even be able to spot the 'blank' period (causing the image to flicker)
I'm not sure what you mean there - movies do have motion blur. It's... well, "built into" them, I suppose, but each frame is not just a still photo of exactly one moment in time - when filmed at 24 FPS, you get 24 frames that altogether encompass a whole second. In other words, each frame captures 1/24th of a second (or about 40 milliseconds) - all of what happened in that time. Therefore, if an object is moving, you would get its position throughout the whole exposure time, thus capturing the motion blur. Everything in movies has motion blur because of that. So when you are shown the 24 frames, the transition between them is much smoother to the human eye because it's closer to what real life looks like, i.e., not a collection of still images but continuous motion.
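To make the exposure idea concrete, here is a minimal, hypothetical Python sketch (not from anyone in the thread, and the numbers are made up): if a frame is effectively an average of everything that happened while the shutter was open, a moving object ends up smeared across every position it occupied during that window, which is the blur baked into the frame.

```python
# Minimal sketch (hypothetical): a film frame as an average over its exposure window.
# An object moving at constant speed is "smeared" across every position it
# occupied while the shutter was open. All numbers here are made up.
FPS = 24
EXPOSURE = 1.0 / FPS      # assume the whole 1/24 s frame interval is exposed
SPEED = 480.0             # object speed in pixels per second
SAMPLES = 8               # sub-frame positions averaged into one recorded frame

def smeared_positions(frame_index):
    """Positions the object occupies during one frame's exposure."""
    start_t = frame_index / FPS
    return [SPEED * (start_t + i * EXPOSURE / SAMPLES) for i in range(SAMPLES)]

positions = smeared_positions(0)
print(positions)                                               # all of them end up in frame 0
print(f"blur length: {positions[-1] - positions[0]:.1f} px")   # ~17.5 px of smear
```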
 

Master Taffer

New member
Aug 4, 2010
67
0
0
To be frank, I don't care. I care about consistent framerate without any drops, not getting it as high as I can get. Framerate, resolution, graphics, etcetera are all things that get pushed to the back of my mind while playing a game. I should be focused on gameplay, environment, and narrative while I'm playing a game; not performance. That's called immersion and/or engagement. If I'm thinking about performance something has gone wrong. 30 FPS can also evoke a completely different mood compared to 60 FPS, so there's a legitimate argument for it when it comes to aesthetic preference of the developer.

I'm sure it's nice having 60 FPS on a game like Destiny, Titanfall, Rainbow Six, etcetera where reaction time matters more, but for a game like Assassin's Creed? I don't care.
 

linwolf

New member
Jan 9, 2010
1,227
0
0
Master Taffer said:
To be frank, I don't care. I care about consistent framerate without any drops, not getting it as high as I can get. Framerate, resolution, graphics, etcetera are all things that get pushed to the back of my mind while playing a game. I should be focused on gameplay, environment, and narrative while I'm playing a game; not performance. That's called immersion and/or engagement. If I'm thinking about performance something has gone wrong. 30 FPS can also evoke a completely different mood compared to 60 FPS, so there's a legitimate argument for it when it comes to aesthetic preference of the developer.

I'm sure it's nice having 60 FPS on a game like Destiny, Titanfall, Rainbow Six, etcetera where reaction time matters more, but for a game like Assassin's Creed? I don't care.
The thing is, immersion is exactly why I care about fps. When I watch others play, I rarely notice if the game is only running at 30, but when I play myself I get the feeling that my movement is sluggish, and that brings me out of the game.
Around 45+ that feeling disappears and I can stay focused on the game. Everything above that just makes it feel smoother but isn't necessary.
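For anyone who wants numbers on that sluggish feeling, here is a tiny hypothetical sketch (the 45 fps threshold is subjective; this just converts fps into frame time): the gap between consecutive frames, and therefore the soonest your input can show up on screen, shrinks a lot between 30 and 60.

```python
# Minimal sketch (hypothetical): frame time is what you actually feel under your thumbs.
for fps in (30, 45, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms between frames")
# 30 fps -> 33.3 ms between frames
# 45 fps -> 22.2 ms between frames
# 60 fps -> 16.7 ms between frames
```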
 

UsefulPlayer 1

New member
Feb 22, 2008
1,776
0
0
I've been playing Assassin's Creed 2 and Spec Ops: The Line on my Xbox 360. I've beaten both games before, but I thought I'd play them again. They are as fun as I remember.

So yeah, I guess I'm pretty satisfied. Hopefully Dragon Age Inquisition is good when it comes out.

You guys take it too seriously. Just try to have some fun....
 

Diaconu Cristian

New member
Oct 21, 2011
10
0
0
Sort of late to the party, but I feel like stating my point of view as a PC and console player: the problem is not 30fps or 60fps or 720p or 1080p or whatever p. The problem is consistency... inconsistent frame rates (10fps to 40fps, then back to 10fps) or fps drops (a bloody 20fps for 10 seconds or more at a stretch) when the game was advertised as 30fps locked are the real problem.
And here is why a lot of people make a fuss out of this: many people like me might be interested in purchasing one of these new consoles, and they want to know if these bloody machines can keep that 30fps locked (because advertising is one thing and reality is another, especially for multi-platform games).
If people are satisfied so easily with whatever p and whatever framerate, soon you will have 600p and 25fps on the PS4 because the games are "good". Graphics may not be the most important thing in a video game, but if they didn't matter then we would all be playing text-based adventure games :p
PS: sorry for my bad engrish :D
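To illustrate the consistency point with numbers, here is a minimal, hypothetical Python sketch (the frame times are invented, not measured from any console): an average that looks close to the advertised 30 fps can still hide stretches of 10 fps, which is exactly what a "1% low" style figure is meant to expose.

```python
# Minimal sketch (hypothetical): average fps can hide exactly the drops being complained about.
# These frame times are made up; a capture tool would log real ones.
frame_times_ms = [33.3] * 290 + [100.0] * 10   # mostly "30 fps", plus a stretch of 10 fps frames

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

worst = sorted(frame_times_ms, reverse=True)[:len(frame_times_ms) // 100]
one_percent_low_fps = 1000 / (sum(worst) / len(worst))

print(f"average fps: {avg_fps:.1f}")           # ~28, looks roughly like the advertised 30
print(f"1% low fps:  {one_percent_low_fps:.1f}")   # 10.0, the part you actually notice
```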
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Diaconu Cristian said:
many people like me might be interested in purchasing one for these new consoles and they wish to know if these bloody machines can keep that 30fps locked
For console games the developer is responsible for giving you a consistent framerate. It is always possible to make the game a little prettier at the cost of framerate; good developers know when to stop, while bad developers can make any machine laggy.

One reason why PC gamers care more about framerate is that we are responsible for our own framerates. PC games are rarely locked at 30 fps, so when we see 30 fps we naturally worry that it will drop below 20 fps when something exciting happens. So we lower settings until we get 60. 60 fps doesn't just look a bit better; it reassures us that the game will remain playable. Maybe we overreact to 30 fps console games. Reducing the FOV to keep a game stable is just bad, though.
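As a rough picture of what "locked" means on the developer side, here is a minimal, hypothetical Python sketch (not how any real engine is written, just the shape of a frame cap): the game finishes its work for the frame and then waits out the rest of the 33.3 ms budget, and the lock only holds as long as the work fits inside that budget.

```python
# Minimal sketch (hypothetical) of a 30 fps frame cap: do the frame's work,
# then sleep away whatever is left of the ~33.3 ms budget.
import time

TARGET_FRAME_TIME = 1.0 / 30   # ~33.3 ms per frame

def run_frames(count, simulate_work):
    for _ in range(count):
        start = time.perf_counter()
        simulate_work()                                  # update + render stand-in
        elapsed = time.perf_counter() - start
        if elapsed < TARGET_FRAME_TIME:
            time.sleep(TARGET_FRAME_TIME - elapsed)      # pad the frame to a steady 30
        # if elapsed exceeds the budget, the frame ships late and the "lock" is broken

run_frames(30, lambda: time.sleep(0.020))   # 20 ms of work fits comfortably under the cap
```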
 

ninjaRiv

New member
Aug 25, 2010
986
0
0
Because I don't care and I don't think any console gamer cares. It's not a huge difference at all and we don't play games for the frame rate. People who care about stuff like that buy a gaming PC, don't they?

Not that increased frame rate and extra pretty graphics aren't nice, of course; shiny graphics and the like are nice for the more cinematic games. But I just really don't NEED a higher frame rate to enjoy a game. I don't play a game and think "Oh My God, this game needs 60 FPS. Everything about it is dead without that." Nobody NEEDS it. It's nice, sure! Just not needed.
 

CrystalShadow

don't upset the insane catgirl
Apr 11, 2009
3,829
0
0
DoPo said:
CrystalShadow said:
When we look at old-fashioned film stock, this is being projected at 24 fps, but it takes physical time for the projector to advance the frame. Typically, what the projector does is blank the frame during the transition (with a physical barrier.)
This makes each frame a distinct thing with a blank (dark) period during the transition.
But since people can typically identify what's going on at such low framerates, there's no blur to it, and we may even be able to spot the 'blank' period (causing the image to flicker)
I'm not sure what you mean there - movies do have motion blur. It's... well, "built into" them, I suppose, but each frame is not just a still photo of exactly one moment in time - when filmed at 24 FPS, you get 24 frames that altogether encompass a whole second. In other words, each frame captures 1/24th of a second (or about 40 milliseconds) - all of what happened in that time. Therefore, if an object is moving, you would get its position throughout the whole exposure time, thus capturing the motion blur. Everything in movies has motion blur because of that. So when you are shown the 24 frames, the transition between them is much smoother to the human eye because it's closer to what real life looks like, i.e., not a collection of still images but continuous motion.
Yeah, that's not what I was referring to. I tried to explain that later on, but I'll restate it:

Motion blur on the film stock is a result of blurring captured by the camera (if CGI images were transferred to film stock there might not be any motion blur in the frames at all). However, at 24 frames per second, each frame is perceived by the eye as an individual thing.
At, say, 48 frames per second, this isn't the case: you lose the identifiable transition between any two frames.
This is much the same as motion blur, but happens at a different point in the overall process.

You're confusing what the camera captured with what the eye does with the footage when it's played back. It's the distinction between the objects captured on film being blurred in the footage itself, and the eye blurring together the content of two or more frames because the framerate is higher than what the eye can perceive clearly.
(At 24 fps, not only does the eye not appreciably blur together adjacent frames, but the transition period during which the projector shutter is closed may be long enough to appear as a dark frame, or at least, flickering in the image.)

Anyway, both of these effects are forms of motion blur, but they have different causes.
Playing back footage at 60 fps would cause blurring together of multiple frames.
The blur recorded on any particular frame, on the other hand, depends on the camera, its shutter speed, the recording framerate, and many other factors, all of which can lead to more or less blurring than you would see if you were witnessing the actual events the camera was recording. (When you watch the footage, the motion blur is a function of the camera used, while actually being there makes it a function of the limits of the human eye.)

Anyway, not sure if that's any clearer, but I don't know how else to explain it really...
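If it helps, here is a small hypothetical worked example of the camera-side half of that (the numbers are made up; real productions vary their shutter angle): at a fixed 24 fps, the blur recorded in each frame still changes completely with the shutter, while what the eye does at playback doesn't depend on those camera settings at all.

```python
# Minimal sketch (hypothetical): in-camera blur is set by shutter and fps,
# independent of how the eye blends frames when the footage is played back.
def exposure_time(fps, shutter_angle_deg):
    """Seconds the shutter stays open per frame (shutter angle as a fraction of 360)."""
    return (shutter_angle_deg / 360.0) / fps

def blur_length(speed_px_per_s, fps, shutter_angle_deg):
    """How far, in pixels, a moving object smears within one recorded frame."""
    return speed_px_per_s * exposure_time(fps, shutter_angle_deg)

# Same 24 fps footage, very different blur baked into the frames:
print(blur_length(480, 24, 180))   # 10.0 px with the common 180-degree shutter
print(blur_length(480, 24, 45))    # 2.5 px with a fast shutter (the "choppy" look)
```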
 

asdfen

New member
Oct 27, 2011
226
0
0
I hate choppy frame rates as they are really hard on my eyes. I would prefer all games to have a 30 fps minimum and a 45+ fps average. I hate the same thing about movies, which are usually played at 24fps, but I really cannot influence that unless it's a PC game where my hardware is the limitation.