[Update 2] How/why are console gamers satisfied with 30 fps?

SmugFrog

Ribbit
Sep 4, 2008
1,239
4
43
TKretts3 said:
I've played at 30 fps, I've played at 60 fps. I honestly can't tell the difference.
Then again 720p and 1080p movies don't look too different to me, so maybe I'm not the best judge.
That's pretty much my opinion of it. Unless you've played a game a lot at 60 fps and then gone to 30 (or watched a movie many times over in 1080p, then watched it in 720p), something will just seem off about it. So I'm convinced now that people can notice the difference - I never had a powerhouse computer to test it for myself, nor cared much about getting that 60 fps when I just want a game to be playable (20+ with max graphics is a good day to me). And as you say, you don't notice a difference - my wife is the same. I point out how amazing the graphics are in a game compared to my last computer and she says she doesn't see it.
 

Aaron Sylvester

New member
Jul 1, 2012
786
0
0
Easy way for anyone to find out if they can see the difference between 30fps and 60fps: http://frames-per-second.appspot.com
Mind you, that's just visual; the input lag difference between 30 and 60 fps is a whole other story.
Jonathan Hornsby said:
Morgoth780 said:
Jonathan Hornsby said:
DoPo said:
So, on one hand, we have actual scientific evidence that you're wrong, on the other we have your word for it. It's hard but I think I won't take your word for it unless you provide a citation, as requested.
How about the pages and pages of people in this very thread saying they can't tell the difference?
What about the many, many PC gamers who say there's a huge difference between 60 and 30?
As I said in a previous post in this thread: you've trained yourself to see it. By default a person's senses aren't that...fine-tuned. Not to lavish you with undue praise, but the comparison is somewhat like basing the average person's running speed solely on Olympic gold medalists. Simple fact is that there is a range for this, a certain tolerance for error. No two people are exactly alike, but there is a minimum and maximum sensitivity, and those of us on the low end of the spectrum who haven't spent years obsessing over pixels are already reaching our limit. Yours is a bit higher, good for you; it won't be too long before your limit is reached too. The actual average gamer is already about at their limit.
Well, that would explain why even 60 fps feels "choppy" to me now that I've been gaming at 120Hz for nearly 2 years, and 30fps is basically unplayable.

So I guess it really comes down to raising standards and raising the bar. There are only benefits so why not?
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
Jonathan Hornsby said:
Aaron Sylvester said:
Easy way for anyone to find out if they can see the difference between 30fps and 60fps: http://frames-per-second.appspot.com
Mind you, that's just visual; the input lag difference between 30 and 60 fps is a whole other story.
Jonathan Hornsby said:
Morgoth780 said:
Jonathan Hornsby said:
DoPo said:
So, on one hand, we have actual scientific evidence that you're wrong, on the other we have your word for it. It's hard but I think I won't take your word for it unless you provide a citation, as requested.
How about the pages and pages of people in this very thread saying they can't tell the difference?
What about the many, many PC gamers who say there's a huge difference between 60 and 30?
As I said in a previous post in this thread: you've trained yourself to see it. By default a person's senses aren't that...fine-tuned. Not to lavish you with undue praise, but the comparison is somewhat like basing the average person's running speed solely on Olympic gold medalists. Simple fact is that there is a range for this, a certain tolerance for error. No two people are exactly alike, but there is a minimum and maximum sensitivity, and those of us on the low end of the spectrum who haven't spent years obsessing over pixels are already reaching our limit. Yours is a bit higher, good for you; it won't be too long before your limit is reached too. The actual average gamer is already about at their limit.
Well, that would explain why even 60 fps feels "choppy" to me now that I've been gaming at 120Hz for nearly 2 years, and 30fps is basically unplayable.

So I guess it really comes down to raising standards and raising the bar. There are only benefits so why not?
Finally someone who at least gets the gist of what I'm saying. People in this thread may not think of spending thirty plus hours a week gaming at 60+ frames per second as "training" for it, but biologically speaking that is exactly what they're doing. Especially if they've been doing it for a number of years now.
You compared it TO THE OLYMPIC GAMES, for Hades' sake! So basically that says that not only does it need to be your entire job, you have to dedicate your time to "honing" your "skill" and be among the best...or, I dunno, "the best". How one is "the best" at FPS I don't know, but that's what you suggested.

Also, I'm still waiting for that quote, you know.
 

The Lunatic

Princess
Jun 3, 2010
2,291
0
0
In regards to "Being used to it".

Yeah, people thought Gran Turismo 3 on the PS2 was "Photo-realistic" too.
 

Flammablezeus

New member
Dec 19, 2013
408
0
0
SonOfVoorhees said:
The point you're missing is that PC gamers have had 60fps for years - some brag that anything less than 120fps is unplayable. At the end of the day no one cared about fps until the 360/PS3, and I've been gaming since the Atari. It's just a way for people to continue their fanboy arguing and for PC gamers to feel included - I know they hate that the gaming argument is always about consoles and the PC isn't included. At the end of the day, enjoy the platform you're using and the games you're playing - that's all that matters. All this fanboyism just sounds like people are insecure about their console of choice and have to brag about it. When did gaming become about everything other than the games?
I feel compelled to point out that I first began truly noticing low framerates in PS2 games (my first console was the Sega Master System II) and I wasn't a PC gamer at that point. In fact, I'd never even heard of framerates until I looked up why some games were jittering all the time when others were much smoother. It was the same with aliasing (when I first played a PC game and found out what anti-aliasing was, I thought my prayers had been answered).

When you make this all out to be about bragging or anything like that, you're missing the point entirely. If anything, you sound like you might have said something erroneous at some point and now you're just trying to stick to your guns. Maybe go take a breath and actually think about it. Try playing games at different framerates and you'll see that this whole argument revolves around how games can best be played and enjoyed, not some people trying to have a pissing contest like you're making it out to be.
 

Rozalia1

New member
Mar 1, 2014
1,095
0
0
Irick said:
Because they're not arguing this choice as a visionary choice. They're arguing it as an artificial limitation imposed by the medium. As the medium (the engine) has previously been used to provide a fluid experience in the same sorts of situations, this lazy difference _is_ insulting to the work that Carmack has done to optimize that engine. It _is_ insulting to see him flat-out give up when he is given tools that someone spent decades refining for the express purpose of lowering the difficulty of giving a fluid and uncompromised experience.

Like I said, I can understand a cinematic argument. His fixed aspect ratio would even imply this is the case, but for them to come out, point to the consoles as the limiting factor and go "consoles suck, so we don't have to optimise past 30FPS" is dishonest, disingenuous, and not owning up to that fact is insulting to the hard work put into that game engine, not to mention the work put into the console designs themselves.

This, as it has been communicated, is a lazy, not artistic choice. It is a choice to not do more work, despite how much has already been done for him.

So yeah. This isn't a rational reaction. This is me being very angry at someone throwing away decades worth of work because of their unwillingness to spend the time and effort to do it right.
Nothing is stopping them; there is no shadow cabal ready to break their legs if they don't do it. He made the choice to make the experience the same across the board, and that is a choice ultimately. One that should be just as respected as someone who edits a version down until it's possible on a platform that can't handle it (though that may be even more maligned by people on here, even though, you know...that choice means more people can play the game; apparently having the sole version that would have existed anyway is more important).

It's not dishonest - have they said they can't get 60 frames going on PC? Nope. If he came out tomorrow and said his game is going to be made up of blocks because that is what he wants, would you accuse him of being dishonest/lazy for not having top graphics?
Frames are not going to be something he worries about when it comes to his vision, unless his vision strictly requires 60 frames, of course, but I question the validity of that claim itself. I mean, Vanquish was 30 frames (well, 25 often enough), and I've often read of people mistaking it for 60.
 

Skeleon

New member
Nov 2, 2007
5,410
0
0
EternallyBored said:
Eh, I never had any difficulty going back and forth, the most notable examples would be switching on the same FPS game from my PC to the console version at a friend's house, it's noticeable, but I usually stop noticing or caring after about 5 minutes of playing.
I really do. Especially if it's a fast-paced FPS like Painkiller or if reaction times are highly important. Now, I don't necessarily require 60+ fps, but it must be a lot more than 30 for me not to feel like the controls are sluggish and aiming is unresponsive. 45-50 fps is a minimum for me nowadays, something like that. Generally, the faster the gameplay and the more important precision is, the worse low fps feels. I don't play fighting games, but regarding FPS? If I have the choice between higher resolution/better textures and a higher framerate, I'll go for the higher fps almost every time. So, speaking of luxuries: to me, high-quality graphics are the luxury, while a high enough framerate is the important part.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Irick said:
They took John Carmack's engine...
John Carmack, the champion of low latency gameplay.
John fucking Carmack, the first real software engineer homegrown in the game industry.
JOHN, FUCKING CARMACK. HIS GAME ENGINE.
They took it, and decided to framelock at HALF the typical refresh rate?!
John Carmack may have been a great programmer, but his engine was poorly designed. It locked game physics to framerate so badly that the engine did voodoo resolution changes on the spot just to keep that 60 fps rolling, otherwise it would break the game. Like, literally, it would lower your resolution during intense situations just to not break the physics that are locked to FPS.

They decided to lock it at half the speed because consoles can't run it at 60, and the PC port suffers from it too.

Baron Teapot said:
Consoles are typically played on low-resolution televisions that are a few metres away. Especially on an old television, it's difficult to determine the jumping you get from 30FPS from that distance.
Distance has an effect on resolution noticeability; it has NOTHING to do with framerate noticeability.

Baron Teapot said:
Our eyes can't detect greater than 60.
Pure and utter nonsense. The US Army tested human eyes and determined that they can detect as many as 210 different images within a second. They theorized the eye could detect more; their equipment just didn't go higher.

ninja51 said:
They're satisfied because PCs capable of playing new games at even close to 30 fps are expensive and out of range for a lot of consumers. A cheaper dedicated machine that can, for the most part, reliably give 30 fps is an enticing option. Developers have to cater to that machine, so most optimize it pretty well, getting good graphics without the potential for the crazy PC-only display issues that come from not catering to specific PC hardware. It's pretty clear to me.
Ah, more BULLSHIT. A PC capable of playing games at 1080p @ 60fps costs 444 dollars. If you want maximum settings then sure, you are going to spend more, but let's not pretend that consoles get anywhere close to that.

Pseudonym said:
I'd like to ask a question to those who know more of these fps things. In a multiplayer game my ping is almost always above 17 meaning it takes more than a 60th of a second between me pressing a button and the server knowing I did. Much of the time it is above 33 ping turning that to a 30th of a second. Does it then still matter whether the FPS is 60 or 30? If the server doesn't yet know that something happened 2 frames after it happened then for gameplay-purposes those frames can hardly have mattered, right? Or does it work in another way?
It does not work like that. Let's say you've got 33ms ping and 60 FPS. This means you get lag from the server that lasts 33ms, then another 17ms of delay until your screen changes, if we assume absolutely no delay in processing the new data (there is one in reality, but it's unimportant in this comparison). Now, if you have 30 FPS, instead of a 33ms + 17ms delay you get a 33ms + 33ms delay. And trust me, that extra 17ms is very important. Your reaction time may not be as fast, but if you react 17ms faster, you win the match. There is a reason professional FPS players use 120+Hz monitors: every millisecond helps a lot.
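To put rough numbers on that (just an illustrative sketch in Python; it ignores input polling, game logic and display latency, and uses the same figures as above):

# Rough end-to-end delay: network ping plus the wait for the next rendered frame.
# Worst case you wait a whole frame; this sketch uses that worst case.
def total_delay_ms(ping_ms, fps):
    frame_time_ms = 1000.0 / fps       # ~33 ms at 30 fps, ~17 ms at 60 fps
    return ping_ms + frame_time_ms

for fps in (30, 60, 120):
    print(f"{fps:>3} fps: ~{total_delay_ms(33, fps):.0f} ms total with 33 ms ping")
# ~66 ms at 30 fps, ~50 ms at 60 fps, ~41 ms at 120 fps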

Jonathan Hornsby said:
How about the pages and pages of people in this very thread saying they can't tell the difference?
People who cannot tell the difference, and aren't just lying about it, should seriously consider seeing an eye doctor, because normal human eyes CAN tell the difference.
The most likely case for the people here is that they haven't compared the two and don't realize the difference when they play, because better is just better and they don't always compare it to the worse option.


Danny Dowling said:
the one that bugged me the most was a YT comment where someone said games in 30 fps just hurt their eyes now... wow, guess they've waited a long time to watch TV etc if it just hurt their eyes.
A lot of people don't watch TV, you know. And there is a medical condition where low refresh rates can induce nausea and headaches. This is true whether it's gaming, TV, or just poor-quality lighting. These people LOVE 120Hz monitors.



Jonathan Hornsby said:
Not if you can't see them.
But since that isn't a problem (EVERYONE can see it), then you agree?

Jonathan Hornsby said:
I weep for a generation so ashamed of its own intelligence that it refuses to even think for itself.
So stating something that is contrary to all the physical evidence humanity has gathered so far is "common sense" and needs no proof? Yeah, good luck getting anywhere in life with that kind of thinking.

SmugFrog said:
Strazdas said:
there are websites that support native 60fps videos, but the name has slipped my mind now, sadly.
Well, call me a believer... http://30vs60.com/

Even watching that though, if you've only ever played at 30 fps you wouldn't know what you're missing. It's just smoother - like having a better graphics card and better computer. What really convinced me is this one:
Well, I actually meant regular video hosting sites supporting 60fps videos, but the comparison sites will, well, compare as well.

There are no people who have never seen it, unless they are blind when not gaming. Real life runs at billions of FPS (seriously, everything runs on Planck time, which is roughly 10^−43 seconds, although the smallest time we have managed to measure is 12 attoseconds, i.e. 1.2 × 10^−17 seconds). So you cannot claim people have never seen a higher refresh rate.

Even if you limit the argument to videogames specifically, it's still a flawed argument. There are people who have never used a car, but let's not pretend that cars aren't very useful. Ignorance is NOT an excuse.
 

Punkster

New member
Sep 9, 2014
15
0
0
If the games are fun then the frame rate is a minor issue for me.

Sure, it would be nice to have it running at 60fps, but that seems to be one of the issues with gaming these days: so many people focus on such things.

Almost every Total Biscuit video has these gripes about frame rate and resolution, and I get what he is saying, but as long as the game is fun to play, why does it matter so much if the frame rate is locked at 30fps?

That doesn't mean that I am satisfied with 30fps.
 

CrystalShadow

don't upset the insane catgirl
Apr 11, 2009
3,829
0
0
Aaron Sylvester said:
(snip)

I really hope this isn't becoming a thing.

Easy way to find out if you can see the difference between 30fps and 60fps: http://frames-per-second.appspot.com
That's an interesting test.

Lots of variables, though. To get a good sense of its effect on games you have to turn off the motion blur.
One thing that's very apparent though, is that how much difference framerates make in an example like this is dependent on how fast an object moves.

This... Is not a very reliable test in some ways.

For a fast-moving object, shown at a 'framerate' higher than our eyes can perceive, motion blur will be automatic. (For the same reason that the analog image of a real-world object will be blurred.)

Even if we can't tell two adjacent frames apart, our eyes will blur them together automatically, creating motion blur where there otherwise wasn't any.

Slow down the movement, in fact, and suddenly the apparent difference between 30 fps and 60 vanishes.

However, 30 fps is, experimentally, the highest rate at which we can clearly identify the content of individual frames. This makes it a borderline framerate in some regards.

Ironically, what these tests show me personally, is that whether framerates matter largely depends on what kind of content you are seeing...

Set the movement of objects to 2000 px/sec, and 30 and 60 frames a second are very hard to tell apart.
Similarly, at 100 px/sec, you'd be hard pressed to notice the difference. But in the 500-1000 range, it's quite obvious if you have motion blur effects turned off (at 60 fps your eyes would blur it together). With motion blur on, it largely stops being noticeable again...
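To put numbers on that (just a back-of-the-envelope sketch in Python; the speeds are the test site's settings):

# How far the object jumps between consecutive frames: speed / framerate.
# Big jumps with no motion blur are what reads as judder.
for speed in (100, 500, 1000, 2000):               # px/sec
    for fps in (30, 60):
        print(f"{speed:>5} px/s at {fps} fps -> {speed / fps:5.1f} px per frame")
# 100 px/s: 3.3 vs 1.7 px; 500 px/s: 16.7 vs 8.3 px; 2000 px/s: 66.7 vs 33.3 px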

What this shows is that it only really matters for scenes where there are fast-moving objects, and only if they are fast, but still within a range that a human being can reasonably process.

Outside those ranges it barely matters.
And with slow-moving objects (100 px/sec in the test program) it's pretty much impossible to tell the difference.
Similarly, with artificial motion blur effects added, the 60/30 fps difference also largely goes away.
Which is a big clue as to why it's possible to see a difference in games, but probably not in live-action film.

Motion blur as an artificial effect is CPU/GPU-intensive to render, so that doesn't help any.

But really, this all basically just supports my previous intuitive understanding of this:

30 fps is fine for slow-paced games with little action/precision required. (You really won't notice the difference of higher framerates when things move slowly)

But for fast-paced action games which rely on reflexes and have fast-moving objects involved... You WILL notice the difference because higher framerates allow your eyes to create motion blur that is very obviously absent from a 30 fps shot of something...
This is only really noticeable with fast-moving objects.
(On top of this there are many input related things tied to framerates that are internal to games which also benefit, if quick response time matters.)

Artificial motion blur effects can largely negate the visible differences between 60 and 30 fps, but they won't negate the other aspects of how a game functions internally that DO in fact have an impact on games where fast reactions matter.

So... If I'm playing myst or something, or maybe an RPG, I probably wouldn't care or notice.
If I'm playing Quake, or Guitar Hero, or the like, where quick reaction times are a big deal, it's very likely quite a big deal.

Having said all that, games that artificially cap the framerate for no apparent reason are really kind of stupid.
It can be useful to have the option to cap the framerate on PC (if you multitask while gaming for some reason, this means the game isn't completely hogging your hardware when it doesn't need to), but making that a non-adjustable setting is silly.

(Of course, a cap does mean that if the game is capped at 30 fps but your computer can trivially run it at 60, you won't get lag spikes, provided the game is well coded. Because the limit is artificial, even if it suddenly takes twice as long to draw a frame, the framerate will still be 30 fps; the game just stops bumping against the artificial cap that was holding it back when it could run faster than that.)
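For what it's worth, a cap like that is trivial to implement. Roughly something like this minimal sketch of a capped game loop in Python (not any particular engine's actual code; update_game/render_frame are hypothetical placeholders):

import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS        # ~33.3 ms per frame

def run_capped(frames=90):
    for _ in range(frames):
        start = time.perf_counter()
        # update_game(); render_frame()   # placeholder for the real per-frame work
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)   # burn the headroom the cap creates

run_capped()

As long as the real work stays under the budget, the output holds a steady 30 fps, which is exactly the headroom argument above.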

Does it matter?

The only real answer is: It depends.

(And I still remember old PC games where 12 fps was considered more than acceptable... XD)
 

Alcom1

New member
Jun 19, 2013
209
0
0
It seems that most people on this thread classify frame-rate among graphics which seems sensible. I don't, however. 30 FPS comes with a feeling, a distracting one, and acts as a negative multiplier for my gaming experience in a way that graphics don't. If I don't get 60 FPS, I will slash shaders, tear textures, melt meshes, and plow particles to reach 60 FPS. I will reduce graphic settings back to 1996 if it means achieving 60 FPS. The idea of sacrificing 30FPS for more shiny goes entirely against my preference, and the belief that 30 FPS is superior to 60 FPS is slightly aggravating.

The day I completed my current desktop PC over a year ago was the day I cemented my 60 FPS preference. My previous laptop wasn't able to reach 60 FPS with many of my favorite games, and sometimes couldn't even reach 30 FPS; the minimum settings didn't go low enough. The day I tested all of my favorite games on my new contraption was a glorious one, and I was astounded by the difference it made when every single one of my favorite PC games ran at 60 frames per second. This, the feeling of 60 FPS, was something that I wanted to experience with all games.

I can enjoy games at 30 FPS, but it will always detract and be a distraction.
 

Gray-Philosophy

New member
Sep 19, 2014
137
0
0
CrystalShadow said:
. . .

This... Is not a very reliable test in some ways.

For a fast-moving object, shown at a 'framerate' higher than our eyes can perceive, motion blur will be automatic. (For the same reason that the analog image of a real-world object will be blurred.)

Even if we can't tell two adjacent frames apart, our eyes will blur them together automatically, creating motion blur where there otherwise wasn't any.

. . .
This is not entirely accurate, as far as I have learnt at least.

The average human eye is able to register about the equivalent of 24 FPS of movement. Our brain then automatically blurs out the image to adjust for any "missing frames", this is true.
However, this only applies to physical objects since light has to travel from a light source, to the object, reflect off its surface and into our eyes, where it is then processed by our brain.

Animations on a screen work differently from looking at moving physical objects, though, and are not subject to the 24 "FPS" limitation and motion blur. The reason we can tell the difference in FPS in games and such is because there is no actual motion going on; the images on the screen are produced by still light being projected straight at us rather than light reflecting off a moving physical object.

Similarly, if we were to look at a moving light source we would likely be able to "track its movement" at a higher "FPS" than 24, because we're processing projected light straight from the source, and not reflected light.

Detecting differences up to and beyond the 60 FPS mark probably has a lot to do with adaptation though, explaining why some just don't see the difference and some do. It is entirely possible though, even up to and beyond 120 FPS.
 

Teepop

New member
Sep 21, 2014
25
0
0
I'm not a developer, but to my untrained eye not all 30fps games are the same (I own a pretty decent SLI rig, by the way).

Destiny and Shadow of Mordor on my PS4 are great. They feel really smooth even after just playing CS at 144fps or Mordor at 60fps on my PC. Infamous running at a locked 30 felt great too.

Forza Horizon 2 feels incredibly smooth. I mean I can switch between Forza 5 and Horizon 2 and it doesn't even occur to me that I've dropped from 60 to 30.

Watch Dogs on PS4 actually feels smoother than my PC version running at 60 what with all its stutters and pacing issues.

AC4, on the other hand, is a different kettle of fish. I found that more jarring when coming from the PC version. I can live with it on PS4, but it was nicer at 60.

Meanwhile, PS4 TLoU when using the 30fps mode was absolutely horrendous! I couldn't play it like that at all.

If you want my honest opinion, a lot of PC users wrongly exaggerate the problems with 30fps because they are mistaking it for other issues. I bet you most PC users who buy a games console or go round a friend's house to play do not realise that they have massive input lag due to the TV not being in PC or Game mode.

Sure, everyone here will come out and "claim" that they switched the TV to PC or game mode (or connected the console to their monitor), but let's leave the realm of fantasy and be serious, yeah? I bet you most people moaning about 30fps console games are using a TV and are not considering the display lag. Compare like with like. Don't play at 60fps on a PC monitor and then 30fps on a TV with all the dynamic enhancements turned on. Hardly surprising that the controls feel laggy and horrible!

The other thing of course is that they play on consoles with an unfamiliar control method that further adds to their frustrations.

When it comes to locking to 30fps on PC, well I've tried that in the past for the kicks and it never feels the same as 30fps on a console. It feels a lot worse.

Ultimately, halving the frame rate frees up a lot of power for prettier graphics, more NPCs, more cars in a race, etc. If the 30fps that they deliver feels like Mordor, Destiny or Forza 2, then I am quite happy with that method of squeezing more out of what is, after all, low-cost hardware.

FWIW, I can see the difference between 60 and 144fps, but in all honesty I feel no loss when playing at 60, and unless you are a pro gamer I honestly don't see the tiny difference being worth it. Better to use your power for a higher resolution, IMO. The nice thing about PC is having that choice, whereas on console the devs make the choice for you.
 

CrystalShadow

don't upset the insane catgirl
Apr 11, 2009
3,829
0
0
Gray-Philosophy said:
CrystalShadow said:
. . .

This... Is not a very reliable test in some ways.

For a fast-moving object, shown at a 'framerate' higher than our eyes can perceive, motion blur will be automatic. (For the same reason that the analog image of a real-world object will be blurred.)

Even if we can't tell two adjacent frames apart, our eyes will blur them together automatically, creating motion blur where there otherwise wasn't any.

. . .
This is not entirely accurate, as far as I have learnt at least.

The average human eye is able to register about the equivalent of 24 FPS of movement. Our brain then automatically blurs out the image to adjust for any "missing frames", this is true.
However, this only applies to physical objects since light has to travel from a light source, to the object, reflect off its surface and into our eyes, where it is then processed by our brain.

Animations on a screen work differently from looking at moving physical objects, though, and are not subject to the 24 "FPS" limitation and motion blur. The reason we can tell the difference in FPS in games and such is because there is no actual motion going on; the images on the screen are produced by still light being projected straight at us rather than light reflecting off a moving physical object.

Similarly, if we were to look at a moving light source we would likely be able to "track its movement" at a higher "FPS" than 24, because we're processing projected light straight from the source, and not reflected light.

Detecting differences up to and beyond the 60 FPS mark probably has a lot to do with adaptation though, explaining why some just don't see the difference and some do. It is entirely possible though, even up to and beyond 120 FPS.
I'm saying this based on the results of experimenting with the program the OP linked, more so than anything else.
It was a simplification, not an attempt to be 100% accurate with every little detail.

But let's consider a few cases based on the actual science, if you like.
We have a limit to what we can consciously perceive, which most tests have determined to be a change of 1 frame in 30.

But our eyes don't really have a 'framerate'; they operate continuously. (There are other consequences of the chemical process involved, such as that exposure to bright light can temporarily render individual receptors effectively blind, but that's not directly relevant.)

Where the 'real world' is involved, objects are in continuous motion, and giving off light the whole time.
If an object moves too fast for us to perceive clearly, since we never stop receiving light, the result is a blur over a large region.

When we look at old-fashioned film stock, this is being projected at 24 fps, but it takes physical time for the projector to advance the frame. Typically, what the projector does is blank the frame during the transition (with a physical barrier.)
This makes each frame a distinct thing with a blank (dark) period during the transition.
But since people can typically identify what's going on at such low framerates, there's no blur to it, and we may even be able to spot the 'blank' period (causing the image to flicker)
Newer film projectors play a little psychological trick here, though. The film stock used is still only 24 fps, but rather than blank it on the transition, the frame is shown, blanked, then the same frame is shown again before blanking a second time and transitioning to the next. This makes the 'blank' periods shorter, and creates an effective framerate, in some sense, of 48 fps... But the film is still only 24 fps, so you can still pick up on individual frames. What it has done is make the apparent flickering go away.
(A film of a real-world scene will probably contain motion blur on the film frames itself, owing to how a film camera functions, but that's another issue)

Still, 24 fps is too low to be saying anything about motion blur.

But to describe the effects of computer displays at high framerates requires considering what kind of technology the display uses.

Consider old-fashioned television broadcasts, and the old style CRT televisions for a moment.
These displayed either a 50 hz or 60 hz interlaced signal (depending on the TV standard used)
Does this cause the eye to create motion blur?
Well, here, the display itself confuses the issue greatly.

What a CRT display does is scan an electron beam across a screen of phosphors. The phosphor glows, and this glow takes a while to decay. Only one point is actually being scanned at any one moment, and the 'framerate' is simply a measure of how often the display can draw a complete frame. At pretty much no point, however, is it NOT updating the display.
At any given moment it is almost always partway through drawing a frame, but the phosphors glow for long enough to make this imperceptible.
TVs were interlaced displays, though, which means they scan one set of lines on one frame, and the alternate lines on the next. While this seems like it would leave gaps, in reality it blurs the two adjacent frames together (typically the phosphors activated by the first frame are still glowing somewhat during the second).
This innately blurs two adjacent frames together.
At no point is the display not giving off light, but of course, the image does transition in a way that doesn't resemble the real world. (The transition isn't instant, and happens at a single point scanning across the image)

For non-interlaced displays of course, this direct blurring of two frames doesn't happen, but phosphor decay rates still imply adjacent frames inherently blur together a bit.


But... Who still uses CRT displays, right?
Well, LCDs are a lot worse. 'Framerates' are an even more ambiguous concept on an LCD, because it takes time for a pixel to transition from one state to another.
On top of that, the time taken varies (and not in a straightforward linear manner) depending on what kind of transition it is... Going from pure white to pure black takes a different amount of time than going from one intermediate shade to another.

On old LCDs this was VERY obvious, leading to what was called a 'ghosting' effect, which bears some resemblance to motion blurring effects, but tended to be irritating more than anything else.

It's worth noting that LCD displays also scan the image one pixel at a time from the top-left, to bottom-right. This method of updating the display is what can create the tearing effect if a game is rendering frames faster than the display can show them. (The display will be part-way through drawing a frame, then suddenly the content of the frame buffer changes, and the rest of what's drawn is the next frame.)
There is actually no technical reason for an LCD display to update this way; it's merely a legacy of them being built on top of display systems designed for CRT monitors, but that's the way it works.
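As a toy illustration of that scan-out tearing (purely a sketch, not how any real display driver works): a 'display' scanning rows out of a buffer that the 'game' replaces partway through the scan.

frame_a = ["AAAAAAAA"] * 8     # the frame the scan-out starts with
frame_b = ["BBBBBBBB"] * 8     # the frame the game finishes mid-scan

def scan_out(swap_at_row):
    """Scan rows top to bottom; the frame buffer is swapped partway through."""
    shown = []
    for row in range(8):
        source = frame_a if row < swap_at_row else frame_b
        shown.append(source[row])
    return shown

print("\n".join(scan_out(swap_at_row=3)))   # top 3 rows from A, the rest from B: a visible tear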

The main consequence here is if the 'framerate' being shown is very close to the limits of the LCD (which is most of the time, because why run a display at much less than what it's capable of), there will, again, be some blurring of adjacent frames due to the time it takes to transition any individual pixel.
The effect is somewhat different to a CRT display, but still has the approximate effect of blurring together adjacent frames, though the transition is typically slower, and thus smoother, than on a CRT. (Not necessarily a good thing.)

The display technology clearly influences things here, but that's not quite all.

Looking at just the eye alone, there is indeed an obvious difference between an image on a display of a moving object, and an actual moving object. This is true whether you're talking about film, TV, computer games or anything else that is being shown on a screen somehow. (There is a difference between recordings of reality, which introduce blurring in the camera/recording equipment, and artificially created animation, eg games, CGI or even hand-drawn animation, where any such blurring has to be added deliberately)

Assuming computer generated images without motion blur, each frame will show a distinct, static image.
If a person perceives 30 frames a second, but the animation is at 60, there will be 2 frames drawn for each one a person is actually aware of. The overall effect of this is (roughly) that both frames are blurred together and seem to be one thing.

Is this different from reality? Absolutely. Reality is continuous, so there will be an image including all intermediate states between any two positions of a moving object. Functionally this is basically equivalent to having an infinite framerate.

With an artificial image, there will always be discrete, distinct positions even if you blur multiple frames together.
Whether this looks different to 'real' motion blur depends on how fast the object is moving and how large it is relative to the resolution of the display (if an object moves only 1 pixel in a single frame, it'll make no real difference compared to seeing a real moving object).

Anyway, it's true, there is a difference. 'motion blur' of a computer generated image running at framerates higher than the eye can perceive will look like a bunch of distinct positions blurred together, while in the real world it'd look like one continuous smear of light across the whole range of motion.

Strictly speaking it's still motion blur either way. But of course, on the artificial image shown on a screen, it's not motion blur of the objects depicted onscreen, but rather 'motion blur' of the screen itself. (You are adding 'motion blur' to the image frames themselves, rather than the objects in the scene, which is what should be happening.)

It's different, yes, but nonetheless, the effect of exceeding the 'framerate' of the eye is to create a motion blur effect on moving objects. How convincing this looks is dependent on how fast the objects are moving, the resolution of the display, and the actual framerate of the display...
This is only perceptible as different from real-world motion blur if the object is moving too quickly. Otherwise it's near enough to the same effect to go unnoticed.
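If anyone wants to see that numerically, here's a rough sketch (Python with NumPy; a one-dimensional 'screen', with pairs of 60 fps frames averaged together to stand in for what a 30-fps-ish eye would integrate; the sizes and speed are arbitrary):

import numpy as np

WIDTH = 64        # 1D "screen" in pixels
SPEED = 480       # px/sec, fast enough that the discrete positions are obvious

def render_frames(fps, duration=0.1):
    """Render a moving 1-pixel dot as discrete frames, with no motion blur."""
    frames = []
    for i in range(int(fps * duration)):
        frame = np.zeros(WIDTH)
        frame[int(SPEED * i / fps) % WIDTH] = 1.0
        frames.append(frame)
    return np.array(frames)

# Average consecutive pairs of 60 fps frames: two distinct dot positions
# smeared together, which is roughly what exceeding the eye's rate looks like.
f60 = render_frames(60)
perceived = f60.reshape(-1, 2, WIDTH).mean(axis=1)
print(np.nonzero(perceived[0])[0])   # two positions (0 and 8), each at half brightness

# A real moving object would instead leave a continuous smear covering every
# pixel between those positions.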

Still, all of this does show an interesting perceptual reason for why you can tell the difference looking at framerates over 30 fps even though you can't technically perceive what's going on...

Which was altogether too many words to say "you are technically correct, but within certain limits the distinction you're making doesn't matter"

...Why do I do this again? XD
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
I wanted to get The Evil Within. Now I won't bother with it. I have no problem running the game at 30 fps if my hardware isn't good enough to run it at 60, but I won't support anyone who limits the fps on purpose.
 

Gray-Philosophy

New member
Sep 19, 2014
137
0
0
CrystalShadow said:
Gray-Philosophy said:
This is not entirely accurate, as far as I have learnt at least.

The average human eye is able to register about the equivalent of 24 FPS of movement. Our brain then automatically blurs out the image to adjust for any "missing frames", this is true.
. . . snip
I'm saying this based on the results of experimenting with the program the OP linked, more so than anything else.
It was a simplification, not an attempt to be 100% accurate with every little detail.

But let's consider a few cases based on the actual science, if you like.

. . . snip . . .

Which was altogether too many words to say "you are technically correct, but within certain limits the distinction you're making doesn't matter"

...Why do I do this again? XD
Could be for the same reason I believe I did it: we like to enlighten with what we know, regardless of how relevant it might be. You definitely have some good points about a lot of things I haven't considered. It gives some food for thought :)

I'm also not sure I have a rebuttal; we're approaching subjects I don't know enough about to make any certain statements.

Either way though, to get back on topic: I still find a higher framerate more visually pleasing, and I feel it improves my control in shooters significantly.
 

crazygameguy4ever

New member
Jul 2, 2012
751
0
0
Honestly, as a console-only gamer I don't care about a game's frame rate as long as it looks decent. Until about 4 or 5 years ago I didn't even know what framerate was, in fact. I usually don't worry about framerate, since a game's fun factor is what actually counts when playing a game, not the framerate.