Why is 60fps so hard for Developers?


Ironbat92

New member
Nov 19, 2009
762
0
0
Recently an article hit with Naughty Dog talking about how hard it is for the studio to deliver 60FPS 1080p for Uncharted 4. I'm not a tech head and I sure as hell don't know dick about game design, but I just wanted to know: why is it so hard? Why is it so hard to get a game running at 60FPS? I'm not even talking about PC gaming, which has been doing it for years; even games back on the PS3, 360, and Wii, and even games on the PS2, Xbox, and GameCube, could get 60FPS. Again, I'm not a tech head or anything like that, and I'm not trying to come off as a jerk. I'm just curious.
 

Dirty Hipsters

This is how we praise the sun!
Legacy
Feb 7, 2011
8,783
3,362
118
Country
'Merica
Gender
3 children in a trench coat
It's not so much that getting a game to run at 60 fps is hard; what's hard is getting it to run at 60 fps while making the game's graphics look really "next gen."

60 fps can't be seen in advertisements, it's not something that comes across in screenshots, and trailers are usually shown at 30 fps or less, so the priority for most companies is to make their game look as good as possible and then worry about getting it to run at a stable framerate. Tweaking the game to get it to 60 fps is usually one of the last things done, once everything else is finalized, and yes, under those conditions it's incredibly hard to get the game running at 60 fps when they've already all but maxed out the hardware just getting it to run at 30.
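To put rough numbers on that last point (my own back-of-the-envelope arithmetic, nothing from Naughty Dog): at 60 fps everything the game does each frame, logic and rendering alike, has to finish in about half the time it gets at 30 fps.

```python
# Rough per-frame time budgets (plain arithmetic, not engine code).
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to simulate and render a single frame."""
    return 1000.0 / fps

for fps in (30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")

# 30 fps -> 33.3 ms per frame
# 60 fps -> 16.7 ms per frame
# Hitting 60 means fitting all the same work into half the time, which is
# brutal when the game was already only just squeezing into 33 ms at 30 fps.
```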
 

blackdwarf

New member
Jun 7, 2010
606
0
0
Graphics versus performance. Graphics sell, so of course they're choosing that over performance. And unlike what most people say, the majority of consumers don't care whether the framerate is 30 or 60, so long as it is consistent and doesn't dip below 30.

And of course, making a game that runs well takes a lot of time. We have seen games that barely reach 30 FPS even though they look visually bad, with low resolutions and so on; if the developer doesn't invest the time to make its product work, the result will be bad performance.
 

Supernova1138

New member
Oct 24, 2011
408
0
0
The new consoles simply aren't powerful enough to deliver 'next gen' level graphics at 1080p and 60FPS, no matter how much optimization they do. The PS4 has an upper midrange GPU from 2012; the Xbone has a lower midrange GPU from 2012. While those GPUs can play last-gen games at 1080p 60FPS no problem, they struggled to maintain 60 FPS with the newer, more demanding titles that came out before the PS4 and Xbone hit (e.g. Battlefield 3, The Witcher 2, and Metro Last Light) without dropping visual quality.

Framerate doesn't matter for screenshots, most gameplay videos are only 30FPS, and most console gamers simply don't care because they played most of their games at 30FPS last gen and they play with controllers, which helps hide the sluggishness of 30FPS. So it's pretty much like last gen: don't expect to get 60FPS in most titles unless you're playing on PC, and even then you might not get it if the port is utter shit *cough*AssCreed Unity*cough*
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
It's not really about consoles not being powerful enough. Even if a console existed that was a million times more powerful than the most extravagant PC gaming rig, developers would still have to decide between having an extremely nice looking game at 60fps or an even nicer looking game at 30fps, and many devs would still go for 30fps.

Part of it, I think, is that YouTube etc. has traditionally only supported 30fps, so games running at higher framerates will not look any smoother in videos, just less pretty. Hopefully the recent addition of 60fps support will change that.

But there is also the fact that you can't tell how good the controls are in videos. You might be watching a good player who plays well in spite of bad controls, or a bad player who sucks even with good controls. It's hard to say whether 60fps actually matters until you play.

There is also the fact that framerate isn't important in all games, and it is less important with controllers. With mouse aiming you can turn around very fast and this feels a lot nicer with a high framerate. With the right analogue stick, it is not so important.

But I do think it is a bit silly, because even when my PC was less powerful than a PS2 I would still turn down graphical detail to get up to 60 fps.
 

Soviet Heavy

New member
Jan 22, 2010
12,218
0
0
Because the console market vastly overstated the abilities of the next-gen games while still using suicide factories to mass produce their products.
 

Supernova1138

New member
Oct 24, 2011
408
0
0
TopazFusion said:
So let me get this straight, we're just over a year into the current generation of consoles, and the hardware already sucks ass and doesn't deliver what was promised? What the hell?
At this rate they're going to have to replace this generation of consoles MUCH sooner than they did the last. This generation is going to have nowhere close to the same longevity as the last generation had.
The problem is that the Xbox 360 and PS3 had top-of-the-line GPUs put in them when they came out, making them quite expensive to make, but Sony and Microsoft ate the loss and hoped to recoup it by dragging the generation out for a decade and raking in the licensing and subscription fees. This time Sony and Microsoft didn't want to rely so heavily on the loss-leader model and put midrange hardware into their new consoles. This made them cheaper to make and improved profitability, but they're simply not as capable as developers hoped, and they aren't going to last nearly as long as the PS3 and Xbox 360 did. I expect we'll be seeing a 9th generation in 2018 or 2019 at the rate things are going. In a couple of years I wouldn't be surprised if games are back down to 720p and 30 FPS just to keep them playable on the current generation of hardware.
 

CrystalShadow

don't upset the insane catgirl
Apr 11, 2009
3,829
0
0
ShadowRatchet92 said:
Recently an article hit with Naughty Dog talking about how hard it is for the studio to deliver 60FPS 1080p for Uncharted 4. I'm not a tech head and I sure as hell don't know dick about game design, but I just wanted to know: why is it so hard? Why is it so hard to get a game running at 60FPS? I'm not even talking about PC gaming, which has been doing it for years; even games back on the PS3, 360, and Wii, and even games on the PS2, Xbox, and GameCube, could get 60FPS. Again, I'm not a tech head or anything like that, and I'm not trying to come off as a jerk. I'm just curious.
To put it bluntly, because devs have other priorities.

It's not that it's hard; it's just that it's a tradeoff.

Basically, it works like this:

Framerate + Resolution + Graphical effects quality + Detail of models & environment = performance of system

If you increase any of those, one or more of the others has to go down. What devs are really saying when they say they can't do 60fps is that they think the other stuff (particularly the two things on the right) is more important than a high framerate.

Basically, devs think shinier graphics > higher framerate

That's it.

It's a choice. And framerate appears to be the loser every time.
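
To put some rough numbers on that equation (my own illustration, not anything the devs have published): the GPU can only push so many pixels per second, and resolution times framerate decides how many you're asking for before effects quality and detail even enter into it.

```python
# Pixels the GPU is asked to shade per second for a few common targets.
# Pure arithmetic to illustrate the trade-off; real cost also depends on the
# other terms above (effects quality, detail of models and environment).
targets = {
    "1080p @ 30": (1920, 1080, 30),
    "1080p @ 60": (1920, 1080, 60),
    "900p @ 60": (1600, 900, 60),
    "720p @ 60": (1280, 720, 60),
}

for name, (width, height, fps) in targets.items():
    print(f"{name}: {width * height * fps / 1e6:.0f} million pixels per second")

# 1080p @ 30: 62 million pixels per second
# 1080p @ 60: 124 million pixels per second
# 900p @ 60: 86 million pixels per second
# 720p @ 60: 55 million pixels per second
```

Double the framerate at the same resolution and you've doubled the pixel throughput you need, so one of the other terms has to give.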
 

AT God

New member
Dec 24, 2008
564
0
0
I have been spoiled by PC gaming, 60fps has been the standard for me for over a decade now.

That said, I personally cannot tell when a game is at 60 versus when it is at 30; the only time I do notice FPS is when it suddenly changes. For instance, playing Battlefield 3 on my old video card would cause random fps spikes when things got hectic. I ran Fraps to check how bad it was, and it fluctuated from 60fps down to 20 in some parts, but I usually noticed if it dropped from 60 to 45. However, a few weeks ago I played through the South Park RPG entirely without any issues, and then later learned from reading a review of the game that it is locked to 30fps on PC. I had no clue and wasn't at all aware of it.

That said, what was the point of the new consoles if they can't get 60FPS and have modern graphics? I always felt that when a new set of consoles came out, they at least hit the standards of PC gaming at their time of release. PCs continue to grow in strength as tech improves, and then a new set of consoles is released years later and catches up, rinse and repeat. But it seems like these new consoles haven't even caught up to PC this time; they are like a year old and they are already reporting all of these shortcomings that I don't hear about on PC versions (unless they're due to bad porting, like locking games to 30fps).

I am honestly surprised this generation of consoles isn't more modular, like how the N64 had that Expansion Pak slot that allowed increased RAM later in the console's life. I always felt that was an oddly insightful move on Nintendo's part, one they haven't repeated since, however.
 

CpT_x_Killsteal

Elite Member
Jun 21, 2012
1,519
0
41
Because the consoles are weak. That's all there really is to it. They often can't have both, so they choose the shiny graphics over resolution and framerate, since that usually sells better. I don't blame them for it or anything; it's just that the consoles aren't really "next gen". As was said from square one, they're all-in-one contraptions. They weren't designed solely with gaming in mind; if they were, they'd probably be stronger (and actually worth it over a similarly priced computer).
As for why previous generations could hit 60FPS: they weren't very graphically intensive, so they could hit the 60FPS mark even with lower-grade hardware.

That's not to say framerate is the all-important factor in a game; it's certainly a strong point, but it doesn't make or break anything (as long as it's stable and at least 30fps).
Then you've got the people going on about 30fps being "better" or "I prefer it" or "it's more cinematic", which is absolute rubbish. Not to mention the "the human eye can't distinguish past 24-30fps" PR tripe.
 

Aiddon_v1legacy

New member
Nov 19, 2009
3,672
0
0
TopazFusion said:
So let me get this straight, we're just over a year into the current generation of consoles, and the hardware already sucks ass and doesn't deliver what was promised? What the hell?
At this rate they're going to have to replace this generation of consoles MUCH sooner than they did the last. This generation is going to have nowhere close to the same longevity as the last generation had.
Welcome to what happens when PC devs were allowed to take over consoles: they forgot to be efficient. Oh, their games COULD be more fun and original than they are regardless of hardware. But that would require them to be a lot more skillful than they are and excise the more shallow and trite elements out of their titles.
 

Dizchu

...brutal
Sep 23, 2014
1,277
0
0
I actually think that a 30 FPS cap on console games isn't even a big deal. And that's not because I'm a PC gamer who doesn't care about the standards console players have to deal with; I think those who game primarily on consoles have a different set of expectations.

When you're playing on your PC you (typically) have a precise keyboard-and-mouse setup, an HD display (standard PC monitors have been HD since before HD televisions became popular), and a more "involved" feel. You're generally not leaning back on a sofa across the room from the action; you're at a desk a couple of feet away from the display. You can see the pixels, you can sense the response of minute mouse movements; it's a lot more precise.

With precision and scrutiny comes the demand for optimal feedback. 30 FPS might cut it for console games, but in an FPS played on a PC, generally with a lot of camera movement, it is merely tolerable.

On the other hand, console games have traditionally had more of a disconnect between game and player. You're there with the controller specifically designed for the console, a controller that is sometimes needlessly cumbersome for many games (see the Wii and Wii U, and for the most frustrating modern example, the Kinect). Consoles are also advertised to be used by multiple people in the household, and are usually placed in the living area where others can spectate or join. PCs (as their name implies) are for more specialised, personal use.

So while 60 FPS is objectively better and people should quit arguing that low frame rates are "cinematic", console games simply aren't scrutinised as much. 30 FPS can be acceptable and those used to the relatively imprecise controller inputs and distance from the screen probably won't see much of a difference, especially when the game tastefully uses an effect like motion blur.

Besides, don't pretend like our PS1/PS2 games didn't chug when there were lots of things on screen.
 

Maximum Bert

New member
Feb 3, 2013
2,149
0
0
Because, as others have said, there's a trade-off, and that applies to anything, PC or otherwise. It's just that games are made for consoles a lot of the time and they want them to look as good as possible, so they sacrifice framerate to do that. After all, graphics are immediately seen by everyone, but framerate much less so, and unless it's ridiculously low or erratic as hell, it's not felt by many either.

As for the specific case of Uncharted 4, I reckon they can get away with 30 fps comfortably; it's a story-driven game with a lot of action but not a particularly high pace (going from the last installments). There comes a point of diminishing returns, and it differs from game to game. I expect fighters to run at 60 FPS because in almost all cases their systems are based off of it, so you don't want more or less, whereas in, say, an FPS, more would not be a problem. I feel the faster the game moves and requires you to act, the more important the frame rate is, but for games like Uncharted 4 I feel 30 would be fine. 60 would of course be better, but they obviously have other priorities, i.e. the visuals.
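
For what it's worth on the fighters point, here's a minimal sketch of the usual fixed-timestep pattern (a generic illustration, not any particular game's code): move data like "a 3-frame jab" is counted in logic ticks, which only stays meaningful if the simulation steps at exactly 60Hz no matter how rendering is doing.

```python
import time

TICK_RATE = 60                  # fighting-game logic runs at a fixed 60 ticks/sec
TICK_SECONDS = 1.0 / TICK_RATE  # so "a 3-frame jab" always means 3/60 of a second

def run(update, render, keep_running=lambda: True):
    """Fixed-timestep loop: game logic always advances in exact 1/60 s steps,
    while rendering just draws whatever the latest state is."""
    previous = time.perf_counter()
    accumulator = 0.0
    while keep_running():
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        # Step the simulation in whole 1/60 s increments, never variable ones,
        # so frame data and input timing behave identically everywhere.
        while accumulator >= TICK_SECONDS:
            update(TICK_SECONDS)
            accumulator -= TICK_SECONDS
        render()  # rendering may chug without changing what the moves do
```

Halve the tick rate and every move's timing changes, which is why fighters target 60 and basically nothing else, while a slower-paced game like Uncharted can drop to 30 without breaking its design.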
 

Mutant1988

New member
Sep 9, 2013
672
0
0
Game runs at 60 frames per second? Then the game doesn't look good in screenshots.

Game runs at 30 frames per second or less? Then the game looks good in screenshots.
 

wizzy555

New member
Oct 14, 2010
637
0
0
In addition to what others have said: they have moved to new hardware and new game engines from ones that had been optimised over years. It takes time to squeeze out performance. Even PCs aren't running these games super well on the latest hardware.
 

Adultratedhydra

New member
Aug 19, 2010
177
0
0
Every time someone says there's no difference between 30 and 60 FPS I just show them this.

http://30vs60.com/

I also love "the human eye can't see more than 30 frames per second." Well yes, I'll give you that, because the human eye/brain doesn't see/process information in frames in the first place.
 

fix-the-spade

New member
Feb 25, 2008
8,639
0
0
ShadowRatchet92 said:
Recently an article hit with Naughty Dog talking about how hard it is for the studio to deliver 60FPS 1080p for Uncharted 4.
Very simple: the home consoles use cheap hardware, and the priority in marketing is placed on saleable screenshots and trailers, which means making it pretty over making it fast or reliable.

That both systems use cheap memory controllers, which limit how much use they can make of their PC-grade graphics chips and large RAM allocations, isn't helping either.

Maximum Bert said:
I feel the faster the game moves and requires you to act, the more important the frame rate is, but for games like Uncharted 4 I feel 30 would be fine
What about people who suffer from motion sickness (which is a lot of people)? TV and films are specifically played back at 72fps (24 individual frames shown in groups of three) to avoid this, yet games at 30fps aren't, and people like my father can't play for more than five minutes as a result.
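
Just to run the numbers being thrown around here (plain arithmetic on the figures in these two posts, nothing authoritative): how often the screen flashes and how often it shows a genuinely new image are two different things.

```python
# Refresh/flash rate vs. how long each unique image stays up, using the
# schemes mentioned in this thread (film triple-flashed, games on a 60 Hz TV).
schemes = {
    "24 fps film, each frame flashed 3x": (24, 3),
    "30 fps game on a 60 Hz display": (30, 2),
    "60 fps game on a 60 Hz display": (60, 1),
}

for name, (unique_fps, repeats) in schemes.items():
    print(f"{name}: {unique_fps * repeats} Hz refresh, "
          f"a new image every {1000.0 / unique_fps:.1f} ms")

# 24 fps film, each frame flashed 3x: 72 Hz refresh, a new image every 41.7 ms
# 30 fps game on a 60 Hz display: 60 Hz refresh, a new image every 33.3 ms
# 60 fps game on a 60 Hz display: 60 Hz refresh, a new image every 16.7 ms
```

Either way, the flashes and the unique images are separate counts: even triple-flashed film only ever shows 24 new images a second.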

To a lot of people 30fps is visibly flickery; it's what kept me away from last gen's consoles too.
 

laggyteabag

Scrolling through forums, instead of playing games
Legacy
Oct 25, 2009
3,385
1,090
118
UK
Gender
He/Him
Because the consoles aren't as powerful as we were originally led to believe, and you cannot just crank out double the frames without some kind of compromise in terms of graphics.

Graphics sell games, and 60FPS is just something that a lot of console gamers don't care about. Is there an obvious undeniable difference between 30FPS and 60FPS? Yes. But is there also an obvious and undeniable difference between having 13 more flowers by that rock over there and having higher quality textures? Yes. Now, what looks better in screenshots or videos (that for the most part run in 30FPS anyway)?

It usually comes down to a question of priorities, and aside from the odd game that can manage both on a console, most of the time, it is generally graphics that get the priority.
 

Maximum Bert

New member
Feb 3, 2013
2,149
0
0
fix-the-spade said:
Maximum Bert said:
I feel the faster the game moves and requires you to act, the more important the frame rate is, but for games like Uncharted 4 I feel 30 would be fine
What about people who suffer from motion sickness (which is a lot of people)? TV and films are specifically played back at 72fps (24 individual frames shown in groups of three) to avoid this, yet games at 30fps aren't, and people like my father can't play for more than five minutes as a result.

To a lot of people 30fps is visibly flickery; it's what kept me away from last gen's consoles too.
In that case I think the message from devs, or at least publishers, is that they don't feel the same way: they believe lowering graphical fidelity to raise the frame rate would have a more adverse effect on sales than the opposite. It sucks if it makes you motion sick, but that's what the upshot seems to be, and unless their games start making a lot of people sick or giving them headaches in a way that shows up in sales, it's not likely to change.

Games aren't going to cater to every possibility, even widespread ones like, say, colour blindness or partial deafness. I'm not saying 30fps is fine and we don't need to go higher, but I can understand them making a call that they and everyone else will have to live with. Some games do put frame rate above graphical fidelity almost all the time, fighters being a prime example, but for games like Uncharted I don't see that changing soon.
 

Pseudonym

Regular Member
Legacy
Feb 26, 2014
802
8
13
Country
Nederland
Adultratedhydra said:
Every time someone says there's no difference between 30 and 60 FPS I just show them this.

http://30vs60.com/

I also love "the human eye can't see more than 30 frames per second." Well yes, I'll give you that, because the human eye/brain doesn't see/process information in frames in the first place.
I honestly can't tell which of those two is 60 and which is 30 without being told.

ShadowRatchet92 said:
Recently an article hit with Naughty Dog talking about how hard it is for the studio to deliver 60FPS 1080p for Uncharted 4. I'm not a tech head and I sure as hell don't know dick about game design, but I just wanted to know: why is it so hard? Why is it so hard to get a game running at 60FPS? I'm not even talking about PC gaming, which has been doing it for years; even games back on the PS3, 360, and Wii, and even games on the PS2, Xbox, and GameCube, could get 60FPS. Again, I'm not a tech head or anything like that, and I'm not trying to come off as a jerk. I'm just curious.
As far as I know it's a tradeoff. You can easily have either 60fps or 1080p on current-gen consoles; both simultaneously, however, is difficult. You can probably get an NES to run games at 60fps, but it just won't look pretty.