OutrageousEmu said:
So it becomes less that the hardware itself can't produce the results and more that the developers don't want to optimise their results to get the best out of the hardware.
I'm really tired of people being so judgmental about developers. Look, the PS3's architecture is
genuinely complicated. That's why Gabe Newell was so annoyed about developing games for the PS3, and John Carmack himself said there is probably not a single person on Earth who fully understands the PS3's architecture. Just look at the manual sets covering the main core processor, the Cell's SPEs, the GPU and the development environments; probably nobody even knows all the linker switches needed to optimise every part of it.
What, you think the developers spending years and millions of dollars working on multi-platform game engines like idTech 5, CryEngine 3 and Frostbite 2.0 are all just lazy?
The fact that current-generation engines can support deferred shading, morphological anti-aliasing and advanced environmental destruction on half-a-decade-old hardware is impressive enough. Of course you're not going to get 60FPS at 1080p with just 256MB of system RAM and another 256MB of dedicated GPU memory. Remember how I said CoD 4 ran at 60FPS? The Xbox also has to render the game at 640p to manage that. Same with Red Dead Redemption. Are you going to say dedicated console developers are lazy as well?
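Just to put rough numbers on that last point, here's a back-of-envelope sketch. The exact figures are my own assumptions, not anything official: I'm taking "640p" as 1152x640, 32 bits per pixel, and roughly four full-screen render targets for a deferred G-buffer.

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, render_targets=4):
    """Rough memory cost of a deferred G-buffer, in megabytes (assumed sizes)."""
    return width * height * bytes_per_pixel * render_targets / (1024 ** 2)

# "640p" taken as 1152x640 here -- that exact resolution is my assumption.
for name, (w, h) in {"1080p": (1920, 1080), "640p": (1152, 640)}.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels/frame, "
          f"{pixels * 60:,} pixels/s at 60FPS, "
          f"~{framebuffer_mb(w, h):.0f} MB just for the G-buffer")
```

Under those assumptions, 1080p is nearly three times the pixel count of 640p every frame, and the G-buffer alone eats a noticeable slice of a 256MB pool before textures, geometry or the front and back buffers get a single byte.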
And I have yet to see the Wii U's hardware specs; I'll come to a conclusion once I see them.