Why Don't Consoles Have Performance Settings Like PCs?

J Tyran

New member
Dec 15, 2011
2,407
DoPo said:
Because it is the same hardware.
Not always, not exactly anyway. If you look at the 360 hardware it changed a lot over its lifespan, and so did the PS2. By the end of both consoles' life cycles the main CPU had shrunk in die size several times, and in the final revisions (PS2 Slim & Xbox 360 S) the main and graphics processors were crammed into a kind of APU. The 360's Xenon CPU, its Xenos GPU and the eDRAM from the Xenos chip were all crammed into the XCGPU APU.

Functionally it was the same and gave similar performance at greatly reduced power consumption and, consequently, heat output, but in a technical sense it was a different set of hardware from what the first 360s shipped with.
 

Canadamus Prime

Robot in Disguise
Jun 17, 2009
14,334
I would say presumably it's because the hardware on consoles is always a constant, whereas the hardware on PCs is never the same from PC to PC, so performance settings need to be adjustable so the game can perform at its best on the PC it's running on.
 

Mikeyfell

Elite Member
Aug 24, 2010
2,784
Olas said:
Mikeyfell said:
um... because consoles aren't PCs...
That's just the nature of things

If you want to customize your experience you play on PC. If you want to..... um... there really isn't a benefit to console gaming any more... but it used to be the simplicity of "plug in and play".
Now that consoles are basically just shitty PCs there could be a setting that lets you switch between 1080p @ 30 FPS and 720p @ 60 FPS or whatever.

But frankly I don't think anyone cares enough to include a feature like that
It seems like some people do care; at least Microsoft and Sony seem to think people do, judging by their E3 presentations. Besides, if you want to play a game that's exclusive to a console you don't really have a choice in the matter, even if you are a PC elitist.

I don't defend consoles, they're pretty much worse than PCs in every way, but it seems like this is at least one area where they don't have to be.
Back in the early 2000s consoles were awesome.
Back at the height of the 3D platformer era, before things like firmware updates and DLC made consoles shitty,
back in the day when, if you wanted to play a fun game without installing anything, you'd plug it into your PS2 and hope you still had space on your memory card.

Nowadays a good PC only costs about twice as much as a new console, with infinitely more fun games to play.
I still like consoles, and I'll probably always be a console scrub at heart, but preferring console play to PC is impossible to justify.
 

Remus

Reprogrammed Spambot
Nov 24, 2012
1,698
canadamus_prime said:
I would say presumably it's because the hardware on consoles is always a constant, whereas the hardware on PCs is never the same from PC to PC, so performance settings need to be adjustable so the game can perform at its best on the PC it's running on.
This, exactly this. Every PC is different, so before a game can load up on a PC it must first work out the settings at which it can actually run on that PC. Which DirectX version are you using? What processor, what graphics card, what OS, how much RAM, how much hard drive space, how much power, what drivers? All of these things are different for every user; rarely is anything uniform. With consoles, every PS4 runs on the same hardware, every XB1 is the same, everything is uniform. This puts the responsibility on the developers to choose the graphics settings that are optimal for that console, balancing graphical fidelity with framerate. This is why there will always be a market for console games and one for PC games. Console games are plug and play, no fuss. PC games are more customizable because they're required to be that way. The bigger question is how you couldn't figure this out yourself. I thought it was obvious.
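
To put it in concrete terms, this is roughly the kind of one-time auto-detection a PC game has to do and a console game never needs. It's a made-up Python sketch with invented spec names and arbitrary thresholds, not any real engine's detection code:

```python
# Made-up illustration only: hypothetical spec fields and arbitrary thresholds.
from dataclasses import dataclass

@dataclass
class SystemSpecs:
    vram_mb: int    # video memory reported by the GPU
    ram_gb: int     # system memory
    cpu_cores: int  # logical CPU cores

def pick_default_preset(specs: SystemSpecs) -> str:
    """Map whatever hardware was found to a starting graphics preset.

    A PC game has to do something like this on first launch because it can't
    know the hardware in advance; a console game can hard-code one tuned
    configuration instead.
    """
    if specs.vram_mb >= 4096 and specs.ram_gb >= 16 and specs.cpu_cores >= 8:
        return "High"
    if specs.vram_mb >= 2048 and specs.ram_gb >= 8:
        return "Medium"
    return "Low"

# Example: a mid-range machine lands on "Medium".
print(pick_default_preset(SystemSpecs(vram_mb=3072, ram_gb=8, cpu_cores=4)))
```

On a console that whole step disappears, because the answer is always the same.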

Jandau said:
There is literally no reason not to give 2-3 preset options. If your argument is "I play on consoles not to have to deal with that stuff!", you are irrelevant to this discussion, since you'd just leave it on the default setting.
This is the crux of the argument and therefore a very big part of the discussion. Console and PC are two sides of the same coin, the yin and the yang. People play on consoles precisely for their convenience. When console gamers see better graphics at the expense of framerate, they see a broken product, while you may see cool desktop material and maybe a reason to buy that new Nvidia card. They should not have to lose that convenience, that constant reliability, or be forced to learn a whole other skillset they likely purchased a console to avoid in the first place.
 

Jandau

Smug Platypus
Dec 19, 2008
5,034
Remus said:
Jandau said:
There is literally no reason not to give 2-3 preset options. If your argument is "I play on consoles not to have to deal with that stuff!", you are irrelevant to this discussion, since you'd just leave it on the default setting.
This is the crux of the argument and therefore a very big part of the discussion. Console and PC are two sides of the same coin, the yin and the yang. People play on consoles precisely for their convenience. When console gamers see better graphics at the expense of framerate, they see a broken product, while you may see cool desktop material and maybe a reason to buy that new Nvidia card. They should not have to lose that convenience, that constant reliability, or be forced to learn a whole other skillset they likely purchased a console to avoid in the first place.
I literally don't understand what you're saying there. Honestly, what? Learn new skillsets? What skillsets? Broken products? Wait what? You don't seem to be addressing my arguments, but rather some imaginary post in which the author demanded full-on PC options menus on the consoles, which is something I specifically stated isn't necessary.

Ok, let me illustrate with a hypothetical example:

Take Watch_Dogs - At the moment the game is what it is, runs how it runs and has the graphical quality that it has. Now, take that exact game, unchanged in any way, and add one setting to the Options menu. One. A setting that switches between "Normal" and "Fast" (or whatever they want to call it). Now, a console gamer who doesn't want to deal with anything can simply play the default Watch_Dogs, which would be the same as it is now. And if someone thinks "Gee, I wish this game ran smoother", he can switch to the "Fast" option. Suddenly the game looks slightly uglier, but it runs better and feels more responsive.
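
Something like this is all I mean (a made-up sketch with invented option names and numbers, not Watch_Dogs' actual settings):

```python
# Made-up illustration: one extra option, two presets, invented values.
PRESETS = {
    # The default: exactly what the game ships with today, untouched.
    "Normal": {"resolution": (1920, 1080), "fps_cap": 30, "detail": "high"},
    # The one extra option: trade a bit of visual polish for responsiveness.
    "Fast":   {"resolution": (1600, 900),  "fps_cap": 60, "detail": "medium"},
}

def apply_preset(name: str) -> dict:
    """Return the render settings for the chosen preset.

    A player who never opens the menu stays on "Normal" and gets exactly
    the game they'd get today.
    """
    return PRESETS[name]

print(apply_preset("Fast"))
```

That's it. One toggle, default unchanged, and nobody is forced to touch it.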
 

Olas

Hello!
Dec 24, 2011
3,226
canadamus_prime said:
I would say presumably it's because the hardware on consoles is always a constant, whereas the hardware on PCs is never the same from PC to PC, so performance settings need to be adjustable so the game can perform at its best on the PC it's running on.
I addressed this in my OP. I wonder how many people actually read it all the way through before replying because there are a lot of comments like this where people seem to think I'm not aware of how PCs differ from consoles.

As I said in OP, the reason I don't think this argument holds up is because not everyone will agree on what is "best". Just because all systems have the same resources, that doesn't mean everyone will want those resources to be allocated the same way.

Right now many console game developers are targeting 60fps 1080p for their games and sacrificing graphical quality to do so, and yet some people have been vocal about the fact that they don't care if games run at lower resolutions and/or framerates. It seems like if you want to satisfy as many people as possible you should give them options. I just can't see any reason not to.
 

Olas

Hello!
Dec 24, 2011
3,226
Jandau said:
Remus said:
Jandau said:
There is literally no reason not to give 2-3 preset options. If your argument is "I play on consoles not to have to deal with that stuff!", you are irrelevant to this discussion, since you'd just leave it on the default setting.
This is the crux of the argument and therefore a very big part of the discussion. Console and PC are two sides of the same coin, the yin and the yang. People play on consoles precisely for their convenience. When console gamers see better graphics at the expense of framerate, they see a broken product, while you may see cool desktop material and maybe a reason to buy that new Nvidia card. They should not have to lose that convenience, that constant reliability, or be forced to learn a whole other skillset they likely purchased a console to avoid in the first place.
I literally don't understand what you're saying there. Honestly, what? Learn new skillsets? What skillsets? Broken products? Wait what? You don't seem to be addressing my arguments, but rather some imaginary post in which the author demanded full-on PC options menus on the consoles, which is something I specifically stated isn't necessary.
Even if I demanded full-on PC options menus I don't see how it would be a problem as long as they were optional, with a preset default in place for people who never want to mess with them. Ideally you could have a simple set of options on top, like "Beautiful" and "Smooth", with an advanced options menu underneath for people willing to go further under the hood.

However, anything would be better than no options if you ask me. Having convenience and having options are not mutually exclusive; convenience can itself be an option.
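
Roughly what I have in mind, as a made-up sketch (hypothetical option names and values, not from any real game):

```python
# Made-up sketch: presets as one-click defaults, with an advanced menu layered
# on top. None of these option names or values come from a real game.
PRESETS = {
    "Beautiful": {"resolution": (1920, 1080), "fps_cap": 30, "shadows": "high"},
    "Smooth":    {"resolution": (1600, 900),  "fps_cap": 60, "shadows": "medium"},
}

def effective_settings(preset, advanced_overrides=None):
    """Start from the chosen preset, then apply any advanced tweaks on top."""
    settings = dict(PRESETS[preset])           # convenience: one-click defaults
    settings.update(advanced_overrides or {})  # options: go further under the hood
    return settings

# Someone who never opens the advanced menu just picks a preset...
print(effective_settings("Smooth"))
# ...while a tinkerer can override individual values without losing the rest.
print(effective_settings("Smooth", {"shadows": "high"}))
```

The preset is the convenience; the overrides are the options. They coexist just fine.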
 

Canadamus Prime

Robot in Disguise
Jun 17, 2009
14,334
Olas said:
canadamus_prime said:
I would say presumably it's because the hardware on consoles is always a constant, whereas the hardware on PCs is never the same from PC to PC, so performance settings need to be adjustable so the game can perform at its best on the PC it's running on.
I addressed this in my OP. I wonder how many people actually read it all the way through before replying because there are a lot of comments like this where people seem to think I'm not aware of how PCs differ from consoles.

As I said in OP, the reason I don't think this argument holds up is because not everyone will agree on what is "best". Just because all systems have the same resources, that doesn't mean everyone will want those resources to be allocated the same way.

Right now many console game developers are targeting 60fps 1080p for their games and sacrificing graphical quality to do so, and yet some people have been vocal about the fact that they don't care if games run at lower resolutions and/or framerates. It seems like if you want to satisfy as many people as possible you should give them options. I just can't see any reason not to.
Sorry, that was the only explanation I could think of. I can only assume that developers figure that because the hardware is constant they don't need to put in those settings. Whether that assumption is right on their part is neither here nor there.
 

BloodSquirrel

New member
Jun 23, 2008
1,263
If you're going to offer multiple graphics settings, then you need to test and optimize for multiple graphics settings, and you need a more complex system in place so that all of them can be supported.

On consoles, where the game is designed to run at the very limits of the machine, I'm not convinced that this wouldn't come with tradeoffs. You're adding complexity to the product, and that always comes with issues.

And, no, the "Ship a bunch of broken stuff and use 'choice' as an excuse" Linux mentality is not going to fly in something that's actually expected to sell.