The Great Framerate Debate

Shamus Young

New member
Jul 7, 2008
3,247
The Great Framerate Debate

Shamus jumps into Xbox One vs. PlayStation 4 discussion on framerate vs. resolution.

Read Full Article
 

Falterfire

New member
Jul 9, 2012
810
Whelp. After reading the description in the article and carefully studying both of the pictures, I'm having problems noticing the difference. Certainly I wouldn't have noticed if the two images were used separately on pages one and two.

I guess it's time to turn in my gamer card, grab a Filthy Casual Peasant burlap tunic and slink away into the night.
 

senkus

New member
Apr 3, 2010
27
Shamus Young said:
Note the uneasy case where (for whatever reason) the game runs at something oddball like 40fps. You draw a frame and show it. Then you're only two-thirds of the way done when the refresh comes, so it repeats the previous image. Then you finish an image, but the refresh isn't ready yet. Then when the refresh happens the current image is a bit old, but the new one is only one-third done. Then on the next cycle the frame and the refresh are in sync again. The result is this strange stutter-step where it feels like the game keeps shifting between 60fps and 30fps.
Is this with or without any V-Sync modes on?
Falterfire said:
Whelp. After reading the description in the article and carefully studying both of the pictures, I'm having problems noticing the difference. Certainly I wouldn't have noticed if the two images were used separately on pages one and two.

I guess it's time to turn in my gamer card, grab a Filthy Casual Peasant burlap tunic and slink away into the night.
Have another look in full-size: http://cdn.themis-media.com/media/global/images/library/deriv/737/737897.png
 

Clovus

New member
Mar 3, 2011
275
Falterfire said:
Whelp. After reading the description in the article and carefully studying both of the pictures, I'm having problems noticing the difference. Certainly I wouldn't have noticed if the two images were used separately on pages one and two.

I guess it's time to turn in my gamer card, grab a Filthy Casual Peasant burlap tunic and slink away into the night.
Don't look at Rainbow Brite; focus on the brown background. You get weird banding in the bottom image. This would be much, much more pronounced full-screen and in motion. Those bands jumped all over the place; it was pretty ugly. The difference between 720p and 1080p is often pretty subtle, but video of this banding is pretty bad.
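The banding comes straight from the math of 16-bit color: the red and blue channels only get 5 bits each, so a smooth 24-bit gradient collapses into a few flat steps. A minimal Python sketch (mine, not from the article) of quantizing to RGB565 and back:

```python
def to_rgb565_and_back(r, g, b):
    """Quantize a 24-bit color to 16-bit RGB565, then expand it back,
    roughly as a 16-bit framebuffer would display it."""
    r5 = r >> 3   # keep top 5 bits of red
    g6 = g >> 2   # keep top 6 bits of green
    b5 = b >> 3   # keep top 5 bits of blue
    # Expand back to 8 bits per channel by replicating the high bits
    return (r5 << 3 | r5 >> 2, g6 << 2 | g6 >> 4, b5 << 3 | b5 >> 2)

# Sixteen adjacent shades of a dark brown collapse into just two bands:
gradient = [to_rgb565_and_back(80 + i, 60, 40)[0] for i in range(16)]
```

Sixteen distinct input levels become two output levels, which is exactly the flat-band look in the bottom image.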
 

TiberiusEsuriens

New member
Jun 24, 2010
834
There is yet one more compounding factor to the debate: we've reached a point where increased resolutions and texture sizes give us diminishing returns (i.e., spending tons of memory and CPU for teeny-tiny boosts), yet all people ever talk about is how important those tiny boosts are, while ignoring the truly groundbreaking GPU additions.

The biggest changes we will see in graphics engines in the near future are post-processing effects. This includes fancy lighting, reflections, smoke, particle effects, anti-aliasing, ambient occlusion, tessellation... terms that most gamers may have heard of but never knew the meaning of (because no single effect is a GIGANTIC change). These are the features that set apart the true 'next-gen' game engines such as Crytek's CryEngine. These effects are also at the root of major modern gaming news, such as the Watch_Dogs downgrade.

CryEngine 3 can give a medium-spec machine such a pretty picture not because it pushes high resolution or fps, but because every single element mentioned above was considered in the engine's design from the beginning, giving the devs time to optimize rendering for them. Watch_Dogs got downgraded because its engine was designed to print a basic image to the screen at a solid fps. The devs then dropped in some of these effects and cranked them up to 11 to see how pretty it could look at E3, only to realize months later that the engine was never designed to handle all of them at once. You can only optimize so much when you don't take these effects seriously into account from the start - when they aren't included on the ground floor. These features end up a tiny afterthought when they should be integral, and not a single one of them is a gamebreaker by itself.

The same happened with Assassin's Creed 4: it was built on the Screed 3 engine and had some pretty lights tacked on that it was never meant to handle (Screed 3 was based on Screed 2, etc.). The result: turning god rays on in Black Flag tanks your fps. While Ubisoft is known for terribly optimized games, this is a broader trend across most companies. Tomb Raider trumpeted TressFX for Lara's hair, but how did the game handle it? Turning TressFX on tanked the fps of TotalBiscuit's mega rig by over half. The new inFamous ran at 30fps but still looked better than many games that can run at 60, because the devs dedicated early development to lighting and reflections. The game had its issues, but very few people complained about it not looking good.

CryEngine 3 is considered one of the truest "next gen" engines, and it was released FIVE YEARS AGO. Just sayin'.
 

Bad Jim

New member
Nov 1, 2010
1,763
Something I remember about the old 3dfx cards is that some of the later ones (Voodoo 3, I think) worked at 24-bit internally and dithered the output. So you didn't actually see banding unless the textures were terribly designed.
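The trick Bad Jim describes - render at higher precision, then dither on output - works by nudging neighboring pixels so they round to different levels instead of all snapping to the same band. A hypothetical sketch of ordered dithering with a 2x2 Bayer matrix (my illustration, not the actual 3dfx hardware algorithm):

```python
# 2x2 Bayer threshold matrix; entries are scaled below so the offsets
# span most of one 5-bit quantization step (8 units out of 256).
BAYER_2X2 = [[0, 2],
             [3, 1]]

def dither_channel_5bit(value, x, y):
    """Quantize an 8-bit channel value to 5 bits, adding a position-
    dependent offset first so adjacent pixels round in different
    directions and flat bands get broken up."""
    offset = BAYER_2X2[y % 2][x % 2] * 2   # 0, 2, 4 or 6
    return min(255, value + offset) >> 3

# A value partway between two 5-bit levels lands on different levels
# depending on pixel position, averaging out to the true shade:
row = [dither_channel_5bit(84, x, 0) for x in range(4)]
```

Viewed from a normal distance, the alternating levels blend into an intermediate shade instead of a hard band edge.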

Also, 40 fps does not look weird. It looks smoother than 30fps, and less smooth than 60fps. That the frames are visible for varying lengths of time does not matter. The eye cannot make out the individual frames, it just sees that there are more of them. Fluctuating fps is only noticeable when, for example, you go from a corridor into a big room and your framerate drops from 60 to 30.

And something that should be mentioned regarding framerate: it's a pity YouTube is stuck at 30 fps. A 30 fps game loses nothing in a YouTube video, while a 60 fps game loses half its frames, so the 60 fps advantage never shows up. And that's 90% of what people will look at before buying a game. If everyone went to Twitch instead, which does 60 fps streams, I think a lot of devs would target 60 fps.
 

Pyrian

Hat Man
Legacy
Jul 8, 2011
1,399
San Diego, CA
senkus said:
Shamus Young said:
Note the uneasy case where (for whatever reason) the game runs at something oddball like 40fps. You draw a frame and show it. Then you're only two-thirds of the way done when the refresh comes, so it repeats the previous image. Then you finish an image, but the refresh isn't ready yet. Then when the refresh happens the current image is a bit old, but the new one is only one-third done. Then on the next cycle the frame and the refresh are in sync again. The result is this strange stutter-step where it feels like the game keeps shifting between 60fps and 30fps.
Is this with or without any V-Sync modes on?
Um, yes? Turning V-sync off will not affect the refresh rate of the monitor. It will, however, allow the frame to change in the middle of a screen refresh, causing "tearing", where part of the screen is showing one frame and the rest of the screen is showing the next frame. ...I hate that.
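The stutter-step from the article's 40fps example can be made concrete. A minimal Python sketch (mine, not from the article), assuming v-sync on and a 60 Hz display, counting how many refresh intervals each rendered frame stays on screen:

```python
# Work in ticks of 1/120 s so the arithmetic stays exact:
# one 40 fps frame takes 3 ticks; one 60 Hz refresh takes 2 ticks.
FRAME_TICKS = 3
REFRESH_TICKS = 2

def refreshes_per_frame(num_frames):
    """With v-sync on, a finished frame appears at the first refresh
    at or after it is ready.  Return how many refresh intervals each
    of the first num_frames frames stays on screen."""
    appear = []
    for i in range(1, num_frames + 2):
        ready = i * FRAME_TICKS
        appear.append(-(-ready // REFRESH_TICKS))  # ceiling division
    return [b - a for a, b in zip(appear, appear[1:])]

# Frames alternate between 1 and 2 refreshes on screen - i.e. between
# 16.7 ms and 33.3 ms - which reads as flickering between 60 and 30 fps:
pattern = refreshes_per_frame(6)
```

With v-sync off, each frame instead replaces the previous one mid-scan, which is exactly the tearing described above.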
 

90sgamer

New member
Jan 12, 2012
206
It's a shame that your columns are shuffled off the main page by The Escapist, Shamus. As for this article, the body of it doesn't support your conclusion, which I thought strange. In any case, I don't see how your analysis shows the matter to be subjective. If anything, it only supports the conclusion that some people are better equipped than others to judge graphical fidelity. If this article were about art, your conclusion would have been that which artistic medium is best is subjective -- in part -- because some of the people viewing the art need corrective lenses.
 

rofltehcat

New member
Jul 24, 2009
635
It probably also comes down to what you are used to. I have a pretty big tolerance for low frame rates. I normally notice the difference between 720p and 1080p, I just don't really care as much. I still find high frame rates and higher resolutions to be simply better and more enjoyable but I don't think it is that big of a deal. But then again I grew up without high frame rates or even high graphics settings and couldn't even run high resolutions if I wanted to because our screens were always pretty bad.

In the end style also plays a huge role. If the art style is done well, the game can run on a low resolution and if the game is designed around it it can even run at low frame rates. It can still be enjoyable and visually very pleasing.
If the art style isn't designed to deal with a lower resolution, or the gameplay isn't designed around allowing low frame rates, it can still diminish the game. For example, I don't think 30 fps at 720p is particularly suited for fast-paced multiplayer shooters that aim to get as close as possible to photorealism. As long as the frame rate doesn't dip below 30, I'll still prefer them on the PC. The only saving grace for them on consoles is probably the aforementioned greater distance from the screen.
They're still playable imo - as I said, my tolerance is pretty high - and if I had a console and no PC to adequately play them on, I'd still get them for the console. A single game (or even a few) isn't really worth investing in an additional platform for.

I wonder if there are noticeable differences between generations. E.g. ~40 year olds (who grew up with ugly TVs and very abstract graphics) having a high tolerance, ~25 year olds (who grew up with acceptable TVs and bad graphics) having a medium tolerance, and the young folk (who grew up with HD TVs and good graphics) puking their guts out when things get jumpy.



90sgamer said:
It's a shame that your columns are shuffled off the main page by The Escapist, Shamus.
Uhm, isn't it on the main page? Over by the right hand side where also Yahtzee's, Bob's and whoeverdoescriticalintel's articles are? The main site presents the article twice to me at the moment. One medium sized one is directly above the news window and the smaller one is over on the right side where all the articles go. Or do you mean the banner thingy? The space on there seems very limited. Depending on the topic they could maybe push it into the science&tech banner. This article certainly has a very large tech aspect and is certainly very informative about how display tech works/worked.
 

Xeorm

New member
Apr 13, 2010
361
rofltehcat said:
90sgamer said:
It's a shame that your columns are shuffled off the main page by The Escapist, Shamus.
Uhm, isn't it on the main page? Over by the right hand side where also Yahtzee's, Bob's and whoeverdoescriticalintel's articles are? The main site presents the article twice to me at the moment. One medium sized one is directly above the news window and the smaller one is over on the right side where all the articles go. Or do you mean the banner thingy? The space on there seems very limited. Depending on the topic they could maybe push it into the science&tech banner. This article certainly has a very large tech aspect and is certainly very informative about how display tech works/worked.
I don't know anyone else, but when I'm on this site I'll routinely check the main banner on the front page and scroll through that, then check down to see the news. Anything else on the screen is random crap that I'll maybe notice if I'm paying attention.

They've relegated this column to the random crap section, so I don't always notice it as quickly as I otherwise might, which is unfortunate.

Bad Jim said:
Also, 40 fps does not look weird. It looks smoother than 30fps, and less smooth than 60fps. That the frames are visible for varying lengths of time does not matter. The eye cannot make out the individual frames, it just sees that there are more of them. Fluctuating fps is only noticeable when, for example, you go from a corridor into a big room and your framerate drops from 60 to 30.
Careful about saying what the eye can or can't make out. Quite a few people won't be able to notice any sort of difference, sure, but others might, especially if the effect is called out to them. It's likely a small thing either way, but honestly, small things are often the worst. They nag at your brain and tell you "something is off about this," but your conscious mind can't pin down why, causing some dissonance.
 

Imperioratorex Caprae

Henchgoat Emperor
May 15, 2010
5,499
I've always held this viewpoint: the FPS or refresh rate doesn't really matter. I can't honestly top this well-written piece, so I'll defer to Mr. Young (man, that's hard to type, because my private high school principal, who was responsible for me getting kicked out over BS reasons, was also named Mr. Young).
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
Falterfire said:
Whelp. After reading the description in the article and carefully studying both of the pictures, I'm having problems noticing the difference. Certainly I wouldn't have noticed if the two images were used separately on pages one and two.

I guess it's time to turn in my gamer card, grab a Filthy Casual Peasant burlap tunic and slink away into the night.
Part of the problem for me is that I can notice the difference but I'm hard pressed to care. I heard Watch Dogs was being compared to GTA V and thought "great!"

I should probably grab a tunic, too.

Xeorm said:
Careful about saying what the eye can or can't make out.
He said the eye can't make out individual frames at that frame rate, and it's true. The human eye can only pick out a few discrete images in a second. He also said that you'll notice more or less smoothness.

The closest to a contestable part was his use of an example. Even that's true if taken as an example, not as the only rule.

I'm also not 100% sure the people claiming they can see the difference on higher frame rates really can. I used to have access to a studio with high end equipment, and the people who claimed they could hear the difference between analogue and digital or between higher-quality MP3s and lossless were full of it. This wasn't the most scientific of experiments, but it's close enough for this level of discussion.

I'd be surprised to find it was any different with high frame rates.
 

michael87cn

New member
Jan 12, 2011
922
I started gaming in the 8-bit era, so I'm fine with anything. If it's fun, that's what matters. IMO, even the most high-detail, high-res, high-framerate video game character still looks incredibly fake to me. They could have 'drawn' it as a sprite and I would still have enjoyed the game if it was fun.

In fact I love sprites, more games should use them.
 

Robyrt

New member
Aug 1, 2008
568
Zachary Amaranth said:
Xeorm said:
Careful about saying what the eye can or can't make out.
He said the eye can't make out individual frames at that frame rate, and it's true. The human eye can only select a few discrete images in a second. He also said that you will notice more or less.

The closest to a contestable part was his use of an example. Even that's true if taken as an example, not as the only rule.

I'm also not 100% sure the people claiming they can see the difference on higher frame rates really can. I used to have access to a studio with high end equipment, and the people who claimed they could hear the difference between analogue and digital or between higher-quality MP3s and lossless were full of it. This wasn't the most scientific of experiments, but it's close enough for this level of discussion.

I'd be surprised to find it was any different with high frame rates.
Some differences are small enough that they can be detected only with a proper experiment. I can detect a 10ms audio/video lag when playing Rock Band, like most musicians, but there's no way I could tell if someone added 10ms of audio lag to an ordinary microphone. Similarly, it's easy to tell 16-bit color depth from Shamus' example, but I played Dark Souls for like 10 hours before noticing that I had accidentally set the fog effects to 16-bit color depth. (And this is a game with plenty of fog.)
 

Bigeyez

New member
Apr 26, 2009
1,135
Frame rate is one of those things where, as long as it's above 30 FPS, it looks fine to me. I can only tell the difference between 30 and 60 FPS if a game is ping-ponging between them. As long as the game runs steady, I can't tell the difference. Then again, my eyes aren't as good as they used to be.
 

Kenjitsuka

New member
Sep 10, 2009
3,051
No mention of G-sync and Freesync?
You should really add them, perhaps talk about Nvidia's Frame Capture Analysis Tool (FCAT) as well!!! :\
 

Redlin5_v1legacy

Better Red than Dead
Aug 5, 2009
48,836
michael87cn said:
I started gaming in the 8bit era so I'm fine with anything. If it's fun that's what matters. IMO atm the most high-detailed high res high framerate video game character looks incredibly fake to me still. So they could have 'drawn' it as a sprite and I would still have enjoyed the game if it was fun.

In fact I love sprites, more games should use them.
So much this. While personally I don't begin to really get bothered by a loss of frame rate until it hits 16fps, I don't require every game to push the limits in that aspect. I am far more interested in the art style, narrative, game mechanics, etc.