Fappy said:
It's hard to blame the devs, really. I imagine the tools they use can easily achieve these benchmarks and the key is simply how to fit that kind of performance into a shitty little box. PC devs that can't get a solid 60 fps don't really have an excuse, but I'll give console devs the benefit of the doubt here. I blame the hardware.
True, but while a PC dev might be able to hit that target on their own system, they'd have a very hard time knowing whether anyone else's system could match it.
Still, this is a little misleading.
1080p/60fps can be accomplished easily enough on any system that's fundamentally capable of outputting that resolution and framerate. (The consoles from the PS2/GameCube/Xbox era, for instance, don't qualify, but most of the ones after do. The Wii is an edge case: according to hackers it has the internals to do it, but not the physical connectors needed to output such a signal.)
Doing 1080p at 60fps is quite possible on any such system, if you scale your graphical quality appropriately.
The real issue here is that these devs clearly have other priorities: they're pushing graphical effects and detail quality higher at the expense of framerate and resolution.
That's fine, but it's basically a three-way trade-off between resolution, framerate, and graphical quality, and they're choosing graphical quality in preference to high framerates.
So in terms of the 'next gen' consoles, what it really demonstrates is not that they can't do 1080p/60fps, but that the graphical standards the devs all seem to be aiming for are too much for the consoles to handle while still leaving power to spare for higher framerates.
So... if 60fps were actually important, then it's clear devs are aiming the graphical standards of their games a bit too high.
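
To put some rough numbers on that trade-off, here's a quick back-of-the-envelope sketch. The resolutions and framerates below are just illustrative examples, not figures from any particular game or console:

```python
# Rough frame-budget arithmetic (illustrative numbers only, not tied to
# any specific game or hardware) showing what the trade-off looks like.

def frame_budget_ms(fps):
    """Milliseconds available to render one frame at a given framerate."""
    return 1000.0 / fps

def pixels_per_second(width, height, fps):
    """Total pixels that must be filled each second at a given resolution/framerate."""
    return width * height * fps

print(f"60 fps budget: {frame_budget_ms(60):.2f} ms per frame")   # ~16.67 ms
print(f"30 fps budget: {frame_budget_ms(30):.2f} ms per frame")   # ~33.33 ms

# Compare 1080p at 60fps against a common compromise like 900p at 30fps.
full = pixels_per_second(1920, 1080, 60)
cut  = pixels_per_second(1600, 900, 30)
print(f"1080p60 pushes about {full / cut:.1f}x the pixels per second of 900p30")
```

Dropping from 1080p60 to something like 900p30 roughly triples the time available per pixel, and that's exactly the headroom these devs seem to be spending on effects and detail rather than on framerate.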