Why Don't Consoles Have Performance Settings Like PCs?


Richard Dubbeld

New member
Nov 8, 2011
26
0
0
I think a more important question is this: how do consoles released at the beginning of a generation play games released at the end of that same generation just as well as a newer version of that same console?

'Tis something I have always wondered.
 

loa

New member
Jan 28, 2012
1,716
0
0
Because it's always the same hardware the game has been developed for.
Bump it up and it'll lag, and not having to worry about lag ever is kind of the point of consoles.
In theory at least.
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
Richard Dubbeld said:
I think a more important question is this: how do consoles released at the beginning of a generation play games released at the end of that same generation just as well as a newer version of that same console?

'Tis something I have always wondered.
Because it is the same hardware. This is the entire point of a console - each console is the same across the generation. Well, mostly the same - some may be smaller or have a new optical device or something, but the hardware for running the games is the same.
 

votemarvel

Elite Member
Legacy
Nov 29, 2009
1,353
3
43
Country
England
Console games do have graphics settings, though; it isn't a new thing. The previously mentioned Mass Effect, for example, lets you turn off the film grain and motion blur.

So if people are accepting of things like that, then why not settings which could alter performance? As mentioned, it's not that you would have to use them; you could stick with the defaults if you wanted.
 

krazykidd

New member
Mar 22, 2008
6,099
0
0
Sniper Team 4 said:
Er...I'll be honest here: all of that stuff? That doesn't matter to me. I don't know what the difference is between frame rates or pixel resolution or any of this stuff. And personally, I don't want any of those options on my console. I play on consoles because I cannot figure out the technical stuff of computers. You guys are basically speaking a different language to me. If these options start showing up on consoles--if I start getting messages when I put a game in that say, "Which resolution would you like to run? Which OS? Where do you want to save?" and all that other stuff that comes with installing a computer game, I'm going to be ticked. I play on consoles because they are simple and easy to deal with. I don't have to fiddle with settings in order to run a game.
But that's just me. I'm sure there are other people out there who would love to have those options, but it makes me wonder why they don't just play on the PC.
I agree with you 100%. I'm not tech savvy in the slightest; it's why I have owned many a console, and nary a gaming PC. Hell, the fact that we must install games now annoys the hell out of me.
 

rofltehcat

New member
Jul 24, 2009
635
0
0
They want everyone to have the same "optimal" experience. I actually think that giving the option between two modes or so (e.g. 30 FPS with prettier effects and 60 FPS with reduced effects) would be a good way to fix problems for people who do not react well to lower frame rates.
They already have the different effects and textures from the PC version anyway.

However, I'm sure that during the level design of some games (e.g. corridor shooters) they go through and add/remove/tweak all the effects so the levels sit just at the maintainable FPS goal. At least on consoles they are probably doing this. On PC they surely write minor FPS dips and spikes off as minor problems (you can always disable something if your PC is a few per cent off their test machines). So that effort would probably have to be doubled (or abandoned, but good luck with that).
 

Lilani

Sometimes known as CaitieLou
May 27, 2009
6,581
0
0
With PCs, devs can't accurately guess how much processing power the player's computer has. So they give options so that the person who doesn't have a $1500 gaming rig can enjoy the game.

With consoles, devs know exactly how much processing power the player has. Every PS3 has the same processing power, and the same with every Xbox 360, 3DS, and PS Vita. And every PS4, Xbone, and WiiU. So all they have to do is design the game to run fine with that processing power and voila, they're good. It takes extra time and coding to make all of those graphical options, and with consoles there really isn't a need for any of it, so they don't have to waste time on that.

Of course, this is all in theory. It can all go to hell if a dev makes a game that does exceed the console's processing power, or would play better if things were toned down.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
More to the point of what some others have said, they don't have to. Simple as. It's a necessity for the PC market. It's not for consoles.
 

x EvilErmine x

Cake or death?!
Apr 5, 2010
1,022
0
0
Sniper Team 4 said:
... I don't know what the difference is between frame rates ...
The clue is in the name. Do you know what a flip book [http://en.wikipedia.org/wiki/Flip_book] is? Computer monitors, TVs and film projectors all work in essentially the same way: they display still images in rapid succession to create the illusion of movement to your eye. The more images (known as frames) they display in a given amount of time (one second), the smoother the illusion of movement becomes. So a game running at 60 frames per second (fps) will make movement seem smoother than one running at only 30 fps.
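
To put rough numbers on the flip-book idea, here's a tiny Python sketch (purely illustrative, not from any game) showing how long each frame sits on screen at each rate:

```python
# Each frame is on screen for roughly 1/fps seconds, so a higher
# frame rate means a new image reaches your eye more often.
for fps in (30, 60):
    frame_time_ms = 1000 / fps
    print(f"{fps} fps -> a new frame every {frame_time_ms:.1f} ms")

# Output:
# 30 fps -> a new frame every 33.3 ms
# 60 fps -> a new frame every 16.7 ms
```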

or pixel resolution ...
Again the clue is in the name. You know those large billboard adverts you see everywhere advertising stuff? Ever gotten really close to one? Or for that matter, when you were a kid, did you ever sit really close to the TV? If you did then you will have noticed that the picture is really made up of lots and lots of dots of ink, or in the TV example lots of little blue, red, and green squares. These are pixels. They are individual blocks of colour that, when viewed at a distance, merge to form an image. Here's a great visual example of what I mean.

[image: a painting magnified to show the individual dots of paint it is made of]

As you can see, when the picture is magnified it's actually made up of lots and lots of little dots of paint. When you look at an image on a screen like a TV or computer monitor, it's essentially doing the same thing as the painting. The more little dots you have, the more detail you can see in the image.



[image: a comparison between a low-resolution texture and a high-resolution texture, shown as an example]
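
To make the "more dots, more detail" point concrete, here's a quick Python sketch (illustrative only) of the raw pixel counts behind two common resolutions:

```python
# The more pixels a screen has, the more detail an image can hold.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels")

# Output:
# 720p: 1280 x 720 = 921,600 pixels
# 1080p: 1920 x 1080 = 2,073,600 pixels
```

1080p has more than twice the pixels of 720p, which is why the jump in sharpness is so visible.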


or any of this stuff. And personally, I don't want any of those options on my console. I play on consoles because I cannot figure out the technical stuff of computers. You guys are basically speaking a different language to me. If these options start showing up on consoles--if I start getting messages when I put a game in that say, "Which resolution would you like to run? Which OS? Where do you want to save?" and all that other stuff that comes with installing a computer game, I'm going to be ticked. I play on consoles because they are simple and easy to deal with. I don't have to fiddle with settings in order to run a game.
But that's just me. I'm sure there are other people out there who would love to have those options, but it makes me wonder why they don't just play on the PC.
I understand that not everyone is technically minded, but it's not really all that hard to understand. Unfortunately, people who are into all this stuff tend to use big, complicated-sounding words and abbreviations that leave the average guy or girl feeling like they are speaking in some sort of code. Don't worry about it; just ask or google it and you can usually find a good explanation in simple, straightforward English. Oh, and if you come across anyone who tries to put you down for not knowing about this stuff, then they are an ass hat and their opinion isn't worth jack anyway, so don't sweat it.

OT

Well, to be honest, consoles don't need them as they are all standardized hardware. However, I DO wish that more games would include options to adjust visual effects, things like turning off bloom or motion blur. It would be nice if that became a standard thing.
 
KingsGambit

Apr 5, 2008
3,736
0
0
There are a couple of issues which make it unwise to give console gamers the option.

- The game is designed to look a certain way and to do so on every console.
- Even newer, more powerful revisions of console hardware (e.g. the various iterations of the 360) were underclocked to match precisely the performance of older hardware (in a similar way to how car makers use the same engine in many cars with different power outputs).
- It would create an issue hitherto unheard of in multiplayer games. A player with lower graphics settings and a higher frame rate will unquestionably have an objective edge over someone with a lower frame rate. Back in the day, I remember mods for UT that removed gibs/blood and turned death effects into a golden sparkle. I heard tell of pro Quake II/III players who used mods that turned character models into wireframes to eke out that much more performance and give them an edge. Players at 60FPS have a distinct advantage over players playing at 30FPS.

- Not an issue per se, but it's not really necessary for consoles either. A developer that has optimised their game reasonably well can be assured of a consistent experience for all gamers on a given platform. The settings benefit PC gamers since practically every one of us has a different hardware configuration and thus needs to lower various things to find the ideal balance of performance and visual fidelity. For an older, high-spec machine, lowering shadows and reflections to medium may be enough. For older, mid-spec machines and lower, they may have to lose some texture quality and antialiasing. But all of us will be able to play it at similar frame rates.

Personally, I think console games are fine without them. The gamers who game on them aren't interested in the best gaming experience, just an easier one. We get a better experience, but at the cost of some extra effort and/or money. What I would wish for is that every third/first person game on the PC have mandatory settings to change FoV. I hate the narrow FoV on console ports to the extent that if I cannot change it, I generally won't play the game.
 

votemarvel

Elite Member
Legacy
Nov 29, 2009
1,353
3
43
Country
England
Lilani said:
With PCs, devs can't accurately guess how much processing power the player's computer has. So they give options so that the person who doesn't have a $1500 gaming rig can enjoy the game.

With consoles, devs know exactly how much processing power the player has. Every PS3 has the same processing power, and the same with every Xbox 360, 3DS, and PS Vita. And every PS4, Xbone, and WiiU. So all they have to do is design the game to run fine with that processing power and voila, they're good. It takes extra time and coding to make all of those graphical options, and with consoles there really isn't a need for any of it, so they don't have to waste time on that.

Of course, this is all in theory. It can all go to hell if a dev makes a game that does exceed the console's processing power, or would play better if things were toned down.
Surely then the consoles should have even more options?

The developers know exactly the hardware and software they have to work with. Therefore it should be even easier to give people options.
 

Easton Dark

New member
Jan 2, 2011
2,366
0
0
KingsGambit said:
- The game is designed to look a certain way and to do so on every console.


Not just a response to you, but to most people in here.

Developers are not the end-all for the best way to play their own game; that should be up to the people actually playing it. If they don't want to touch the options they don't have to, but if they want to play it at 60fps or at an FOV that doesn't make them vomit, please, for the love of options, just let them.

loa said:
Bump it up and it'll lag, and not having to worry about lag ever is kind of the point of consoles.
In theory at least.
PS3 Fallout 3 and New Vegas would like a word.

The days of your safety from technical problems, though they never really existed, are over.
 

Lilani

Sometimes known as CaitieLou
May 27, 2009
6,581
0
0
votemarvel said:
Surely then the consoles should have even more options?

The developers know exactly the hardware and software they have to work with. Therefore it should be even easier to give people options.
But what would be the point of that? With consoles, the processing power is set, so they can't exactly make an option which is more than the console can handle. So the only options would be "Best, but is still guaranteed to run just fine" and "Less than best, but also plays just fine." Why would anybody select a less-than-best option when the best runs just as well? It would be a waste of time and effort. That'd be like a car with power windows, but also a hand crank built in just in case you're dying to crank up the window instead of using the automatic button.

I guess a choice between 60 FPS and 30 FPS wouldn't be a difficult option to integrate, but to give a console game all the sliders and options that PC games have would be pointless.

Again, the reason for giving options in PC games is because people play PC games on everything from $1500 gaming rigs to $100 used laptops. They design the game so that it's playable on the average computer, but include even higher graphical options for the people with hardcore machines, and toned-down graphics for those with slower machines.
 

WouldYouKindly

New member
Apr 17, 2011
1,431
0
0
Because it's presumed that the game will look as good as it can while still having an acceptable frame rate. What constitutes an acceptable frame rate is up to you.

I think it wouldn't be wrong to have two settings. One which will run at 60 FPS and one that will run at 30 FPS. In most single player games, the FPS matters a bit less, so you can get the game to look as good as possible. However, if you want to play the multiplayer portion, it would be better to have a higher frame rate. Giving people the option rather than locking to 30 for both modes wouldn't be a bad idea.
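
Under the hood, a 30/60 toggle mostly amounts to a frame-time budget the renderer has to fit inside. Here's a minimal, hypothetical Python sketch of a frame limiter (no real engine works exactly like this, but the idea is the same):

```python
import time

def run_capped(target_fps, frames=5):
    """Render frames, then sleep off whatever is left of the
    per-frame budget (1/target_fps seconds)."""
    budget = 1.0 / target_fps
    for _ in range(frames):
        start = time.perf_counter()
        # ... update and render the frame here ...
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)

run_capped(30)  # single player: spend the budget on visuals
run_capped(60)  # multiplayer: spend it on responsiveness
```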
 
KingsGambit

Apr 5, 2008
3,736
0
0
Easton Dark said:
KingsGambit said:
- The game is designed to look a certain way and to do so on every console.


Not just a response to you, but to most people in here.

Developers are not the end-all for the best way to play their own game, that should be up to the people actually playing it. If they don't want to touch the options they don't have to, but if they want to play it at 60fps or at an FOV that doesn't make them vomit, please for the love of options just let them.
I never said anything about "best way". What I meant was "consistent" way. Best is irrelevant. Every console gamer will have the exact same experience (in a way, that's what console gamers are paying for, an assured experience). Ultimately, performance is the number one factor so they tune visuals to get the best looks they can with a fixed performance expectation (and fixed resolution, another variable on PC, albeit a mandatory one).

They kind of cheat on consoles, in a way. By forgoing 60FPS in favour of 30FPS, what they're saying is "we can't design a game that both looks and performs brilliantly. So we'll knock performance down by half and use the extra horsepower for visuals." But as long as gamers are happy playing at 30FPS (something unheard of back in the day; ironically, while hardware power has increased, games perform worse, albeit while doing a great deal more) they'll get away with eking out extra visuals on that basis alone.

If a developer *could* give you the best possible experience for both performance and visuals on even the lowliest machine (or if we all had identically specced PCs), the settings wouldn't be required for PC either. They account for different PC configurations and capabilities, whereas with consoles you can optimise for one hardware platform. If a console gamer wanted the "best" experience, they'd be playing on a PC; the trade-off is simplicity and (generally) a lower initial outlay.
 

Easton Dark

New member
Jan 2, 2011
2,366
0
0
KingsGambit said:
Ultimately, performance is the number one factor so they tune visuals to get the best looks they can with a fixed performance expectation (and fixed resolution, another variable on PC, albeit a mandatory one).

They kind of cheat on consoles, in a way. By forgoing 60FPS in favour of 30FPS, what they're saying is "we can't design a game that both looks and performs brilliantly. So we'll knock performance down by half and use the extra horsepower for visuals." But as long as gamers are happy playing at 30FPS they'll get away with eking out extra visuals on that basis alone.

If a developer *could* give you the best possible experience both for performance, and visuals on even the lowliest machine (or if we all had identically specced PCs), the settings wouldn't be required for PC either. They account for different PC configurations and capabilities whereas with consoles, you can optimise for one hardware platform. If a console gamer wanted the "best" experience, they'd be playing on a PC; the trade off is simplicity and (generally) a lower initial outlay.
For some developers it's performance as #1 (hello Platinum Games). For some it's visuals (hello The Order). It's an argument I see frequently over which is more important.

If a developer can't give the best possible combination of stunning visuals and unrivaled performance, I feel like the least they could do to make gamers of both persuasions happy is give a binary option like FF14 does to either make it pretty, or make it run better.

Not even getting into settings that can cause or ease physical illness, like FOV or bloom.

In other words, I don't think different graphics options are only meant for different spec PCs. I see no reason console gamers should be denied choice.

Sorry about the gif, I feel bad for using it now. I shouldn't do that.
 
KingsGambit

Apr 5, 2008
3,736
0
0
Easton Dark said:
KingsGambit said:
Ultimately, performance is the number one factor so they tune visuals to get the best looks they can with a fixed performance expectation (and fixed resolution, another variable on PC, albeit a mandatory one).

They kind of cheat on consoles, in a way. By forgoing 60FPS in favour of 30FPS, what they're saying is "we can't design a game that both looks and performs brilliantly. So we'll knock performance down by half and use the extra horsepower for visuals." But as long as gamers are happy playing at 30FPS they'll get away with eking out extra visuals on that basis alone.

If a developer *could* give you the best possible experience both for performance, and visuals on even the lowliest machine (or if we all had identically specced PCs), the settings wouldn't be required for PC either. They account for different PC configurations and capabilities whereas with consoles, you can optimise for one hardware platform. If a console gamer wanted the "best" experience, they'd be playing on a PC; the trade off is simplicity and (generally) a lower initial outlay.
For some developers it's performance as #1 (hello Platinum Games). For some it's visuals (hello The Order). It's an argument I see frequently over which is more important.
Well, visuals are important but they cannot be "more" important than performance. Performance is the one thing that cannot be compromised. Visuals can be altered (AA/AF on or off, lower/higher quality textures, lower/higher poly models, lower/higher resolution) as can other aspects (shorter/longer duration decals, number and complexity of AIs, physics objects, etc) but performance has to be at a minimum playable benchmark. If performance is not up to an acceptable standard then the game is poorly optimised and/or coded, full stop. The best looking game in the world that runs at 1FPS isn't a game, it's an unplayable slideshow.

30FPS has been shown to be a playable standard. So console game makers settle for that and use the performance savings to increase other aspects (visuals, AI, dynamic effects, physics, etc). On a console, it's about uniformity. 60FPS is objectively a better, smoother experience than playing at half that, however console gamers are content to accept it as a minimum.

Easton Dark said:
I feel like the least they could do to make gamers of both persuasions happy is give a binary option like FF14 does to either make it pretty, or make it run better.

Not even getting into settings that can cause or ease physical illness, like FOV or bloom.
I agree, though not in the case of multiplayer games, where it will disadvantage some. We accept this on the PC, and I think it's a given that we as players accept responsibility for our own performance levels, and that other players may be experiencing the same game in a better or worse way, simultaneously. Not having FoV settings at a minimum is, IMO, out of order, though I appreciate it can cause issues. But on PC it should be enshrined in law, carved in stone and in every handbook.

Easton Dark said:
In other words, I don't think different graphics options are only meant for different spec PCs. I see no reason console gamers should be denied choice.

Sorry about the gif, I feel bad for using it now. I shouldn't do that.
It comes down to necessity. If a PC game dev could be assured that everyone could experience the game as best as possible, we wouldn't need advanced settings either. The only real choice console gamers *could* be offered would be two presets of "30FPS, standard visuals" or "60FPS, lower visuals"; otherwise the game would suffer stuttering, artefacts and screen tearing at anything in between.
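
The "anything in between" point comes from how fixed-refresh displays work. Assuming vsync on a 60 Hz panel (an assumption for illustration), a new frame can only appear on a refresh tick; a small Python sketch of the arithmetic:

```python
# On a 60 Hz display with vsync, frames can only change on a
# refresh tick. If the frame rate divides evenly into the refresh
# rate, pacing is even; if not (e.g. 45 fps), frames alternate
# between one and two ticks on screen and motion judders.
refresh_hz = 60
for fps in (30, 45, 60):
    ticks = refresh_hz / fps  # refresh ticks per game frame
    pacing = "even pacing" if ticks.is_integer() else "judder"
    print(f"{fps} fps: {ticks:.2f} ticks per frame ({pacing})")
```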

Personally, I would sacrifice 60FPS for 30FPS @ 1080p. The reason for this is that LCD screens have "native resolutions": basically the actual number of physical pixels. My HD television physically has 1,920 horizontal pixels on each of its 1,080 lines. A console game running at 720p or anything else is leaving it up to the television to "fill in" and stretch the image intelligently to occupy 1920x1080 pixels. Running content at other than native res on an LCD screen is visually, noticeably worse than running at the correct res. This same effect can be demonstrated by taking any image of, say, 640x480, resizing it to 1920x1440 in Photoshop, and seeing how the result looks.
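
The native-resolution point is simple arithmetic. Assuming a 1080p panel showing a 720p render (the Photoshop experiment above, in code form), a rough Python sketch:

```python
# Scaling 720p output to a 1080p panel: each game pixel must cover
# 1.5 physical pixels in each direction. Since half-pixels don't
# exist, the TV blends neighbouring pixels to fill the gap, which
# softens the image compared to a native 1080p render.
native = (1920, 1080)
rendered = (1280, 720)
sx = native[0] / rendered[0]
sy = native[1] / rendered[1]
print(f"scale factor: {sx} x {sy}")         # 1.5 x 1.5
print(f"integer scale? {sx.is_integer()}")  # False -> must interpolate
```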

Whether you believe it or not, the advanced options really are just there to allow for different PC configs. Each of us has a different GPU, amount of GPU memory (VRAM), RAM and processor (power/speed), the main factors in how a game runs. I would bet money that if you found any 20 PC gamers, every single one of them would have a different hardware config (not even mentioning OS, driver version, background apps, etc).

I specced my PC so I could run everything at max, in every game released to date (and the foreseeable future), at 1080p and 60FPS without any compromises (and paid 3x the cost of a current gen console for the privilege). Someone with less video memory than I have would benefit from lowered texture quality and lower poly models. Someone with a weaker GPU would need to turn off post processing or lower shadows and reflections, f.ex, to have the same performance as me. Or they could play at a lower res and/or framerate but maintain the same visuals as I enjoy.

Consoles don't have these issues because RAM, CPU, GPU and VRAM are identical across the board (if not necessarily the same hardware (as I illustrated with the 360 example [http://forums.digitalspy.co.uk/showthread.php?t=901401]), certainly identical performance). Thus the "need" for them is gone, and the only reason to include them might be to give players a choice (though, in so doing, it increases dev workload by needing two sets of textures/models, etc). Not to mention issues like this from Capcom [http://www.escapistmagazine.com/articles/view/video-games/conferences/e32014/11694-Dead-Rising-3-PC-Version-Will-Run-Into-Issues-Above-30-FPS-Says-Capcom]. Console gamers have Brightness options strictly to allow for different lighting conditions/screen brightness.

The fact is, devs/publishers are lazy and want to do as little as possible in terms of finances and time. If they can push out and sell something "acceptable" they'd rather do that than make a better product with extra options. Heck, look at CoD/AssCreed...same game, same engine, re-released year on year for full price. It's the thing every developer dreams of...almost no work, no innovation, no investment, just re-release the same thing year on year and get paid for it. And you want these people to give you options when they don't need to?
 

Jandau

Smug Platypus
Dec 19, 2008
5,034
0
0
There is literally no reason not to give 2-3 preset options. If your argument is "I play on consoles not to have to deal with that stuff!", you are irrelevant to this discussion, since you'd just leave it on the default setting. Seriously, it wouldn't be a factor to you. All it would do is give people who'd like the option of the game running smoother at the expense of some visual bling what they want.

They don't have to include all the various settings you see in PC options menu, just make a setting like "Normal", which would be what you usually get, and "Fast", which would be a slight visual downgrade that would improve framerate. That's it.
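
As a sketch of how little this would take, here's a hypothetical Python mock-up of the two-preset idea (names and values invented for illustration; a real engine would map these to its own internal settings):

```python
# Two bundles of the same knobs PC menus expose individually.
PRESETS = {
    "Normal": {"target_fps": 30, "shadows": "high",   "motion_blur": True},
    "Fast":   {"target_fps": 60, "shadows": "medium", "motion_blur": False},
}

def apply_preset(name):
    for key, value in PRESETS[name].items():
        print(f"set {key} = {value}")

apply_preset("Fast")  # everyone else just stays on "Normal"
```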