lacktheknack said:
CrystalShadow said:
Chibz said:
Exact same games? Yeah... I think that's kind of the whole problem in a nutshell. XD.
Those really ridiculously expensive gaming PCs you're going on about have the capacity to make a 360 look like a seriously archaic joke.
A rough estimate would suggest a current high-end PC is about 10 times faster, if not more.
But that counts for nothing when we get exactly the same games.
Because, aside from anything else, that means those games are designed around the lowest common denominator (the 360 in this case), which means all that high-end hardware basically sits around doing nothing.
What's the point of having a game run at 400fps? It's good for a laugh, but other than that it's just a waste of resources.
Oh well. I guess we won't see any real progress in games until the next console generation shows up...
Meanwhile... We're back to a pointless and arbitrary war. Yay!
This one as well (despite them trying to point out that we've only sunk into a dumb flamewar again, they couldn't resist lobbing some flaming poo on their way past).
Escapistmagazine.com forums, what happened to you?
Well of course. Although apparently, you've sunk to calling any disparaging remark a flame... Not that I can take any of this seriously enough to give a serious response, but in the end it amounts to the fact that both consoles and PCs are collectively emphasising the worst aspects of their respective systems and downplaying their strengths.
Used to be, cross-platform development between PC and consoles was relatively rare. Probably because the gap in capabilities between the two was far wider than it is today.
In the past, consoles were better at some things, PCs better at others.
First it was raw processing power vs specialised graphical abilities. This eventually became specialised abilities vs more powerful but more generic abilities, leading to consoles being able to pull off amazing 2D graphics that PCs had a hard time with, but PCs being able to pull off the earliest half-way decent 3D graphics.
Unfortunately, with the advent of 3D consoles, and dedicated 3D hardware in PCs, the overall abilities of the two gradually converged to the point where the primary distinguishing feature is now the input device, and the difference in development targets: powerful but frequently changing hardware in PCs, versus more predictable but much more static (and thus rapidly outdated) hardware in consoles.
That, and the sudden increase in piracy concerns on the PC thanks to its highly open design and increasingly easy access to the internet. (A PC can run all the tools needed to pirate a PC game, irrespective of its security features; to pirate a console game, you need to mess with the console AND have a PC to do at least some of the work.)
But, despite that, the convergence in overall abilities means that where comparing a PC to a console used to be almost like comparing apples to oranges, the modern situation is like comparing apples to slightly different apples.
It's this convergence that has been hurting both sides, even as it's made cross-platform games much more common.
Consoles are under pressure to compete graphically with PCs, for one thing. Something that used to be so implausible that nobody even considered it. (Not because console graphics were necessarily inferior, but they tended to function rather differently, thus having radically different strengths.)
This means one of two things: console games end up looking bad compared to PC games, or PC games never live up to their potential, hamstrung by designs built with console limitations in mind. (And if you think it's as simple as graphical effects, bear in mind that the level design of a well thought out game takes technical limitations into account.)
Consoles now have online systems as well, which, predictably, has made developers lazy about debugging their games.
This would have been a disaster in older console generations. In fact, this is illustrated quite well by the bug in Metroid: Other M, which, thanks to the Wii's inability to patch games, required shipping out replacement game discs AND asking customers to send in corrupted save files on an SD card.
That's pretty expensive. (Though not as expensive as a recall of a cartridge based game would have been.)
PC games have had patches just about forever though. But there, at least, the reason seems quite understandable: PC hardware isn't standardised, and users have free rein to install just about any software they like on their PC, a lot of which can be running in the background at any given moment, especially given the ability to multitask.
The hardware alone leads to hundreds of thousands of unpredictable configurations. Add driver and system software variations and you make that a hundred times worse. Add in installable software, and the situation becomes almost too chaotic to deal with.
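Just to put a rough number on that (and to be clear, every figure below is an invented placeholder I made up for illustration, not real market data), a quick back-of-the-envelope sketch in Python shows how fast it multiplies out:

# Back-of-the-envelope sketch of why PC configurations explode.
# Every count here is an invented placeholder, purely for illustration.
cpus         = 40    # CPU models in common use
gpus         = 50    # GPU models
ram_configs  = 10    # memory sizes / speeds
motherboards = 15    # chipset / board variants

hardware      = cpus * gpus * ram_configs * motherboards  # ~300,000 combinations
with_drivers  = hardware * 100       # driver + system software variations on top
with_software = with_drivers * 100   # background / installed software on top of that

print(f"hardware alone:          {hardware:,}")
print(f"+ drivers / system SW:   {with_drivers:,}")
print(f"+ installable software:  {with_software:,}")
# A console, by contrast, is (ideally) exactly one fixed configuration.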
Consoles shouldn't be having that kind of problem, yet somehow we're getting there. Probably because system updates now allow consoles to do this too. Sort of.
Anyway, I could go on and on listing this in unending detail...
But somehow I get the feeling the heart of this problem, which seems to be related to PC / console convergence, can be traced back to Microsoft.
I personally witnessed a game console crashing with a distinctly Microsoft-style BSOD. That was, in fact, a Sega Dreamcast, which, if you do your research, you'll find ran Windows CE...
The PS2 and GameCube were both consoles in the traditional sense as far as I can tell. Yes, there was obvious convergence with PCs even there, in what their abilities were, but they were built around different parts and architectures.
Documents show the GameCube, for instance, was using TEVs (Texture Environment Units), which are an interesting design reminiscent of very simple hardware shaders, but working along quite different principles.
Around this time, PCs got their first generation of pixel and vertex shaders.
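For anyone wondering what "quite different principles" actually means, here's a very loose sketch, written as plain Python rather than real GameCube or DirectX code (the formula and names are just my approximation of the idea, not the actual hardware spec):

import math

# Loose illustration only; NOT real GameCube (TEV) or DirectX shader code.

def tev_style_stage(a, b, c, d, bias=0.0, scale=1.0):
    """A fixed-function combine stage: the formula is hard-wired, and the
    developer only chooses which colours/alphas feed a, b, c, d plus a few
    modifiers. Roughly: clamp((d + lerp(a, b, c) + bias) * scale)."""
    lerped = a * (1.0 - c) + b * c
    return max(0.0, min(1.0, (d + lerped + bias) * scale))

def programmable_pixel_shader(tex_colour, vertex_colour, time):
    """An early programmable pixel shader is a small arbitrary program run
    per pixel: any maths you like within the instruction limits."""
    pulse = 0.5 + 0.5 * math.sin(time)
    return tuple(min(1.0, t * v * pulse) for t, v in zip(tex_colour, vertex_colour))

The point being: a TEV-style stage lets you pick what feeds a fixed formula, while a shader lets you write the formula itself.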
And then, of course, the Xbox showed up and really screwed everything up in terms of convergence.
Because the original Xbox wasn't just a device with similar intent to a gaming PC... it WAS a PC. Every single component was recognisably a PC component, right down to an Intel Celeron processor, an Nvidia graphics chip, and most of the usual peripherals.
These days, the PS3 pretty much uses PC graphics hardware. The 360 uses a prototype DirectX 10 hardware design... which only leaves the Wii, with a design lifted from the GameCube.
(But the rumours of Nintendo's next console show... surprise: PC graphics hardware.)
Truly, we have entered a period of such convergence that the end result seems to be the worst of both worlds, irrespective of whether you prefer consoles or PCs.
If you can be bothered to read all that, I'll let you off for your flippant accusation. If not, clearly 'flaming' is the only thing anyone actually pays attention to anyway, isn't it?