Yea, but most AAA games that were primarily developed for the console market typically include many post-processing effects and other bells and whistles in the PC version that will cause even high-end systems to struggle to some extent. Titles like The Witcher 3 and even GTA V require a fairly pricey rig to run at 1080p/60 fps with all settings maxed out. If you consider playing at higher resolutions like 1440p and 4K, then even very high-end systems can be pushed to their limits. There is no single graphics card on the market right now that can run the most recent titles at 4K/60 fps. IMO, the PC hardware of today isn't being held back at all, because consumer-grade hardware hasn't improved all that much in the past three years.

NPC009 said:
Keep dreaming. It's not just consoles pulling the PC elite down, it's other PC gamers as well.

Zipa said:
It's simple: 60 fps is objectively better than 30 fps. That doesn't mean that a game running at 30 fps is unplayable; it can still look and play excellently.
That said, it is kind of a shame that games still have to run at 30 fps at all thanks to the underpowered hardware in the consoles. The gap between PC and consoles is likely to widen even further as early as this year, as both Nvidia and AMD have new GPUs coming on a smaller process node, which will mean even better performance. I imagine that before this console cycle is over, the PC standard will be 1440p gaming at 60 fps+ (G-Sync and FreeSync are likely to become more common and popular as prices drop, too).
Look at what Steam users are actually using [http://store.steampowered.com/hwsurvey/processormfg/] [http://store.steampowered.com/hwsurvey/videocard/]. The Intel HD Graphics 4000 is the most common video card, and mid-range CPUs are everywhere!
Few developers are crazy enough to develop for only top-range systems. But it would be nice if developers made a habit of putting in a frame rate slider so users can cap theirs at whatever they feel like (and enjoy the wonders of Hearthstone or Minecraft at 144 fps).
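For what it's worth, a frame rate cap is a tiny amount of code. Here's a minimal sketch in C++ of how such a cap could work, assuming a simple single-threaded game loop; target_fps stands in for whatever a hypothetical slider is set to, and the update/render calls are just placeholder comments:

#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;

    const double target_fps = 60.0;  // stands in for the value of a hypothetical in-game slider
    const std::chrono::duration<double> frame_budget(1.0 / target_fps);

    for (int frame = 0; frame < 3; ++frame) {  // a few iterations so the sketch terminates
        const auto frame_start = clock::now();

        // update_game();   // placeholder: game logic would go here
        // render_frame();  // placeholder: rendering would go here

        const auto elapsed = clock::now() - frame_start;
        if (elapsed < frame_budget) {
            // sleep off whatever is left of this frame's time budget
            std::this_thread::sleep_for(frame_budget - elapsed);
        }
    }
    return 0;
}

A real limiter would usually combine sleeping with a short busy-wait for better timing accuracy, but the basic idea is the same.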
The notion that consoles hold back PC games is usually false. PC gamers simply tend to overestimate the capabilities of their hardware. The most recent fiasco was the graphical downgrade of The Witcher 3 from what was shown in the 2013 E3 demo. Naturally, many blame the limitations of consoles for this downgrade, but fail to consider that most high-end systems have difficulty running the final product at 1080p/60 fps with maxed-out settings. You would likely need some sort of ridiculous $2000+ rig to have any hope of running The Witcher 3 with the graphical fidelity shown at E3 2013. Why the devs decided to cut these settings out of the final product is less clear, but it is a bit of a moot point given that the vast majority of "hardcore" PC gamers would not be able to use them until 2-3 years after release.