I'm not really a console gamer anymore (the last console I bought was a Wii), but I've been keeping up with the discussions about resolution and frame rate on the newly launched consoles, and it's starting to make me wonder whether console games wouldn't benefit from having the same performance settings as PCs (i.e. texture quality, anti-aliasing, v-sync, etc.).
As far as I'm aware, consoles have always lacked most of the graphics-tweaking options you get on a PC, and for the most part that made sense to me: consoles have standardized hardware that can be optimized for, while PCs have variable hardware that has to be compensated for.
However, I'm starting to question this rationale. Locking all these settings to guarantee identical performance assumes there's a baseline everyone is shooting for, but it's clear that not everyone feels the same way about frame rate: some people want 60 FPS or higher, while others are perfectly happy with 30 FPS. The same is true of resolution: some people don't even have TVs that can take advantage of 1080p (like me), while others insist on it. Some people play games competitively, while others just want to enjoy the atmosphere of the game's world.
I don't really understand why you'd want everyone to have the same experience when it's pretty obvious we don't all have the same priorities. People who don't care, or don't want to bother with these settings, can just stick to the defaults. There's no major downside that I can see; it would probably even benefit PC users, since these settings would most likely trickle up into the PC ports.
If there's some obvious technical reason this can't be done, then I apologize for wasting time here, but from what I can tell, modern consoles are essentially standardized PCs.
What do you guys think? Do you agree?