Why is 60fps so hard for Developers?


ffronw

I am a meat popsicle
Oct 24, 2013
2,804
0
0
TopazFusion said:
So let me get this straight, we're just over a year into the current generation of consoles, and the hardware already sucks ass and doesn't deliver what was promised? What the hell?
At this rate they're going to have to replace this generation of consoles MUCH sooner than they did the last. This generation is going to have nowhere close to the same longevity as the last generation had.
To put it bluntly, no.

This exact same thing happened with the PS3/360. The problem isn't that they need to replace the hardware - the problem is that any new console, by design, is completely outdated at launch.

Why? It's simple, really. They need time. Time to manufacture parts. Time to market. Time to design a case that doesn't look like crap for the damn thing. Once all that's done, they need to write software, test, test more, market it some more, and finally launch it. Oh, and don't forget they have to give devs some time to make some launch titles for the platform. By the time all of that is done, the platform is already something like 2 years old when it hits stores. That means it's already behind the times, tech-wise.

The PS4, for example, was playable at E3 2013, and probably feature complete months before that. At least a year prior to release, Sony was done making any appreciable change to the hardware. When they selected the GPU/CPU, they weren't using bleeding edge components either. They tried to get the most power they could while keeping costs down. That means mid-range GPU components. If they chose an early 2012 GPU, for example, you're talking about something like a GTX 645. Would you want to be running The Witcher 3 on that?

The reality of the console industry is that this trade-off has happened in every single generation. We're just feeling it more now because the pace of tech advances has slowed somewhat. Normally, PCs push the tech further, but even that has slowed of late. The Nvidia GTX 570 is over 4 years old, and we're only now getting games that won't run smoothly on it.

TL;DR: Don't expect this console gen to be a short one.
 

thatonedude11

New member
Mar 6, 2011
188
0
0
I'm not a game developer, but I'm going to try to explain what I think is happening based on my programming experience. I'm just guessing here though, and I could be completely wrong.

To put it simply, making games look better has been getting harder.

It used to be that to make a game look better, all one had to do was increase the texture quality and the polygon count of the models. These things, while not trivial, are relatively simple problems to solve: get better hardware, and you can push better textures and models through it. Creating a better-looking game was mostly a matter of getting better hardware, not a matter of difficult coding. (This isn't to say developers never had technological difficulties in ye olde days, but making a game look better was relatively simple.)

However, over the last console generation, we have reached a plateau with how good our textures and models can look. That isn't to say there is no room for improvement, but there are definitely diminishing returns. So developers have had to use things like advanced lighting, particle effects, and detailed physics to make their games look better. The problem is, all of these things are computationally expensive, meaning it takes a lot of processing power to execute any of them. Combine this with the fact that many games are much more complicated and detailed than they used to be, and you've got games that do not run efficiently.

Finally, in the past decade, consumer-level processors have been getting faster at a slower rate. To compensate, processor manufacturers have been making multi-core processors, which basically function like multiple processors glued together. In theory, a 4-core processor should be up to 4 times as fast as a single-core processor of the same speed. However, writing code that takes advantage of multiple cores is extremely difficult. In fact, if done improperly, it can actually make a program run slower than a single core, and even produce incorrect results. When combined with the demand for increasingly better-looking games, it's no wonder developers are having trouble.
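To make the multi-core point concrete, here's a tiny C++ sketch (just an illustration I put together, not code from any actual game): four threads bump a shared counter. Done naively it's a data race and the total comes out wrong; done with an atomic it's correct, but the cores all fight over the same piece of memory and it can end up slower than doing the work on one core.

```cpp
// Illustration only: four threads incrementing one shared counter.
// The plain-long version is a data race (undefined behaviour in C++;
// in practice you lose updates and get the wrong total). The atomic
// version is correct, but every increment contends for the same cache
// line, so it is often slower than just doing the work on one core.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    constexpr int kThreads = 4;
    constexpr long kItersPerThread = 1000000;

    long racy_total = 0;                 // no synchronization at all
    std::atomic<long> atomic_total{0};   // correct, but heavily contended

    auto run = [&](auto body) {
        std::vector<std::thread> threads;
        for (int t = 0; t < kThreads; ++t)
            threads.emplace_back([&] {
                for (long i = 0; i < kItersPerThread; ++i) body();
            });
        for (auto& th : threads) th.join();
    };

    run([&] { ++racy_total; });               // wrong answer likely
    run([&] { atomic_total.fetch_add(1); });  // right answer, serialized

    std::printf("expected: %ld\n", kThreads * kItersPerThread);
    std::printf("racy:     %ld\n", racy_total);
    std::printf("atomic:   %ld\n", atomic_total.load());
}
```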

All of this leads to developers cutting corners and making trade-offs. While some of these methods are rather clever, others are rather sloppy ("if we make the gun model huge and make the FOV narrower, we don't have to render as much!"). But the biggest trade-off would have to be lowering the frame rate. If you only render half as many frames, you can spend twice as much time on each one. Most players don't care, and you can get significantly better graphics if you limit your frame rate.
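The frame-rate trade-off is really just a time budget. As a rough back-of-the-envelope sketch (my own numbers, nothing from a real engine):

```cpp
#include <cstdio>

int main() {
    // Time budget per frame: 1000 ms divided by the target frame rate.
    for (double fps : {30.0, 60.0})
        std::printf("%.0f fps -> %.1f ms per frame\n", fps, 1000.0 / fps);
    // 60 fps leaves ~16.7 ms for everything (rendering, physics, AI);
    // 30 fps doubles that to ~33.3 ms, which is the extra headroom
    // being spent on prettier graphics.
}
```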

So in summary, the methods used to make games look good are getting more and more hardware intensive, and hardware architecture has been getting more difficult to use efficiently. This has caused developers to cut corners which often means cutting frame rate.
 

MysticSlayer

New member
Apr 14, 2013
2,405
0
0
When it comes to graphics vs. framerate, graphics will win almost every time.

From a marketing standpoint, developers release videos and screenshots of their game. They now rarely release demos, which are the only place where the 30-vs-60 fps difference can really be felt. During this marketing, very few people will ever complain that your game is running at 30 fps, but they will certainly complain if your graphics aren't up to standards. Even among "graphics don't matter" people, I've still seen far more complaints about graphics than about framerate, and the complaints about graphics tend to be more aggressive.

Compounding this is the fact that people probably aren't ever going to notice the difference unless there's something to compare it to. Even with comparisons, such as on that 30vs60 website, it pretty much takes setting up an ideal scenario to spot a few very minor differences. In other words, you have to know which one is which, watch the 30 fps clip for an extended period while the 60 fps clip is covered up, then cover up the 30 fps clip before switching over and watching the 60 fps one. Even in that scenario, which should bring the difference out, I have to look very hard to notice anything, and it only shows up in some of the faster-moving sections. Basically, practically nobody but the most die-hard, framerate-preachy PC fanatics will notice the difference even with a comparison (and I'm not even sure those people aren't just seeing what they want to see), much less without one.

And then there is the user experience. Unless you're playing a game where twitch reflexes and/or timing are very important, 60 fps will not significantly improve the gameplay over 30 fps. And outside of a few action games (including shooters), some platformers, and the fighting and racing genres, the actual experience of 60 fps won't make any difference to the vast majority of players. But woe to anyone who decides to make a game with graphics behind the times! Chances are, people will complain that it "hurts" their eyes, that your technology is outdated, and that you're a lazy developer.

And finally, if you walked up to the average gamer and started talking about 30vs60 fps, they wouldn't know what you're talking about. Most gamers I know don't even know that such factors exist. Graphics are something everyone, not just the most dedicated of gamers, can understand.
 

thoughtwrangler

New member
Sep 29, 2014
138
0
0
Anything involving game programming is "hard" in its own way, but on the sliding scale of difficulty, 60 FPS itself isn't the challenge. The challenge is that devs prioritize the "wrong" things at its expense. Even though art style and animation are what really set games apart, studios are still stuck in the mindset that realism = good.

Because even though graphics have had sufficient verisimilitude for just about any purpose since the mid XB360/PS3 days, HEAVEN FORBID the consumer see a "jaggy" that would require either several days of obsessive perusal or many overlapping mental disorders to notice in the first place. So we keep pushing an envelope that is already fraying at the edges, and we sacrifice the perfectly attainable goals of smooth gameplay and impressive animation fluidity to do so.
 

stringtheory

New member
Dec 18, 2011
89
0
0
Charcharo said:
Pseudonym said:
Adultratedhydra said:
Every time someone says theres no difference between 30 and 60 FPS i just show them this.

http://30vs60.com/

I also love "The human eye can't see more than 30 frames per second." Well yes, I'll give you that, because the human eye/brain doesn't see or process information in frames at all.
I honestly can't tell which of those two is 60 and which is 30 without being told.

ShadowRatchet92 said:
This article hit yesterday, with Naughty Dog talking about how hard it is for the studio to deliver 60FPS 1080p for Uncharted 4. I'm not a tech head and I sure as hell don't know dick about game design, but I just wanted to know: why is it so hard? Why is it so hard to get a game running at 60FPS? I'm not even talking about PC gaming doing it for years; even games back on the PS3, 360, and Wii, even games on the PS2, Xbox, and GameCube, could get 60FPS. Again, not a tech head or anything like that, and I'm not trying to come off as a jerk. I'm just curious.
As far as I know it's a trade-off. You can easily have either 60fps or 1080p on current gen consoles. Both simultaneously, however, is difficult. You can probably get an NES to run games at 60fps, but it just won't look pretty.
My mind can't process how you can't see the difference :(
To be fair, with 30vs60.com, the most obvious difference between the two for me is the flamethrowers, with 60 being smoother. Otherwise, I'd probably have to spend a much longer time squinting at all of the examples. From just a brief glance it's hard to tell the difference; playing, on the other hand...
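On the "60fps or 1080p, pick one" trade-off quoted above, some rough pixel-throughput arithmetic (purely illustrative; real rendering cost isn't strictly proportional to pixel count) shows why it bites:

```cpp
#include <cstdio>

int main() {
    // Pixels that must be shaded per second at various resolution/fps combos.
    struct Mode { const char* name; long w, h, fps; };
    const Mode modes[] = {
        {"1080p @ 30fps", 1920, 1080, 30},
        {"1080p @ 60fps", 1920, 1080, 60},
        {" 900p @ 60fps", 1600,  900, 60},
        {" 720p @ 60fps", 1280,  720, 60},
    };
    for (const Mode& m : modes)
        std::printf("%s -> %ld pixels/second\n", m.name, m.w * m.h * m.fps);
    // 1080p60 needs double the throughput of 1080p30. Dropping to 900p
    // or 720p claws a lot of that back, hence "resolution or frame rate,
    // pick one" on fixed console hardware.
}
```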
 

Veldel

Mitth'raw'nuruodo
Legacy
Apr 28, 2010
2,263
0
1
I don't understand this stupid hard-on for 60 FPS.

I've played both and I don't notice any difference.

Fuck graphics, fuck 1080p and above. I'd rather have a well-made game over a shiny turd any day of the week.
 

Scrythe

Premium Gasoline
Jun 23, 2009
2,367
0
0
Every time this discussion comes up, I can't help but be reminded that despite all of the faults the game had, Sonic the Hedgehog 2006 ran at 60fps.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
fix-the-spade said:
Maximum Bert said:
I feel the faster the game moves and the quicker it requires you to act, the more important the frame rate is, but for games like Uncharted 4 I feel 30 would be fine.
What about people who suffer motion sickness (which is a lot of people)? TV and films are specifically played back at 72fps (24 individual frames each shown three times) to avoid this, yet games at 30fps don't do anything similar (and people like my father can't play for more than five minutes as a result).

To a lot of people 30fps is visibly flickery; it's what kept me away from last gen's consoles too.
I think you are confusing motion sickness with sensitivity to flicker. Movie projectors have a bulb that actually flashes. If they only flashed once per frame there would be annoying flicker that would even make some people nauseous, but it would not be motion sickness because it would affect people even in very static scenes.

Modern TVs and monitors have the backlight on all the time, so there is no flicker, period. Old CRTs do flicker, but they do what movie projectors do: they run at a fixed refresh rate (TVs work at 50-60Hz depending on where you live; monitors can be adjusted up to 85 or 100Hz), and if your game runs at 30fps, they just show each frame twice.
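As a small illustration of that (my own sketch, not anything from a driver or engine), here's how a 30fps game sits on a 60Hz display: each game frame simply covers two consecutive refreshes.

```cpp
#include <cstdio>

int main() {
    // A 60Hz display refreshes every ~16.7 ms regardless of the game.
    // A game locked to 30 fps delivers a new image every ~33.3 ms,
    // so each image is simply shown on two consecutive refreshes.
    const int refresh_hz = 60;
    const int game_fps = 30;
    const int repeats = refresh_hz / game_fps;  // 2
    for (int frame = 0; frame < 3; ++frame)
        for (int r = 0; r < repeats; ++r)
            std::printf("refresh %d shows game frame %d\n",
                        frame * repeats + r, frame);
}
```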

As far as actual motion sickness goes, I have heard more complaints about it when the fps is high. People complained about motion sickness with The Hobbit, for example. That's not a solid argument against 60fps games, though, because it would be easy to have an option to lock the framerate at, say, 30fps.