Smooth Operator said:
And there is the whole "unification" crap on top, people with shit hardware would get offended if someone gets a better running game.
Funny, I don't see the PC space dominated by people complaining that they can't run the latest games at max settings on hardware that's three or more years old.
Let's look at two recent games: The Division and DOOM.
The Division's minimum specs are:
Processor: Intel Core i5-2400 | AMD FX-6100, or better
RAM: 6 GB
Video Card: NVIDIA GeForce GTX 560 with 2 GB VRAM (current equivalent: NVIDIA GeForce GTX 760) | AMD Radeon HD 7770 with 2 GB VRAM, or better
And recommended:
Processor: Intel Core i7-3770 | AMD FX-8350, or better
RAM: 8 GB
Video Card: NVIDIA GeForce GTX 970 | AMD Radeon R9 290, or better
DOOM's minimum specs are:
Processor: Intel Core i5-2400 | AMD FX-8320, or better
RAM: 8 GB
Video Card: NVIDIA GeForce GTX 670 (2 GB) | AMD Radeon HD 7870 (2 GB), or better
And recommended:
Processor: Intel Core i7-3770 | AMD FX-8350, or better
RAM: 8 GB
Video Card: NVIDIA GeForce GTX 970 (4 GB) | AMD Radeon R9 290 (4 GB), or better
Now, do you mean to tell me that someone running the minimum specs for these games is going to get the same performance as someone running recommended or better? Is someone running the base recommended specs going to have the same experience as I do on my rig with a six-core i7, 64 GB of DDR4 RAM, and an EVGA GTX 980 Ti Hybrid?
The answer to both questions is a resounding NO. Those running minimum specs will be playing with almost everything on low settings to get a playable framerate. Those playing on the base recommended specs will not be able to push everything to ultra at a playable framerate like I can on my more-powerful-than-recommended machine. BTW, both games run beautifully on my machine, at max settings, with no overclocking on my end; thanks.
Yet where are the cries from those with, as you say, "shit hardware" about how they can't max out every new game? After all, the PC gaming market, not the console market, is the father of "make it work on the lowest common denominator". Hell, how many years after each new Windows version did games still have to support Windows 3.1? 95? 98? XP?
Hell, The Division will run, like crap, on a goddamn GTX 560! A card that's been around for, what, five and a half years? And you want to talk about "unification"? Like I said, the PC gaming market has had to run on legacy hardware and software (old OS, DirectX, and OpenGL versions) for decades. It's the epitome of unification.