ResonanceSD said:
Scow2 said:
ResonanceSD said:
Pretty sure this is obviously wrong. From a developer's perspective, the only drawcard a console has over a PC is that if you code for one X360, you can be damn sure the code will run on every X360 ever released. The difficulty with PCs lies in the fact that no two of them are exactly the same.
But when the only difference between the machines is the numbers, not the brand and manufacturer, they are much more consistent. An ATI graphics card doesn't function the same way an Nvidia one does. A Toshiba computer is a different beast from a Dell, HP, or Acer computer with the same specs.
Being able to plug in more RAM and/or a better video card to get higher texture resolutions, polygon counts, and particle counts, and/or to cut loading times, is a good thing to me. Requiring me to buy a new console to play a game branded as compatible with my current console, or letting me buy a console that doesn't work with the games supposedly made for it, is a bad thing to me.
Uh, you're aware that different GPU models from the same manufacturer will also function differently?
Also, you both contradicted yourself and proved my point. There's no point in consoles becoming upgradeable, because they still won't match the power of PCs and their convenience, whilst giving up their only competitive advantage for developers.
But the difference is small (if there is any at all) when the GPUs are from the same series.
And what's this bullshit about them not matching the power of PCs? When the 360 came out, it was superior to any PC at up to twice its price tag, and the PS3 surpassed even that. It's not that they're underpowered, it's that they didn't keep upgrading the way PCs did. I remember seeing projections for how long each of the consoles was supposed to last:
The 360 was predicted to be replaced in 2008, since its advantage was in the early start and raw power (which would quickly be surpassed).
The Wii was predicted to be replaced in 2010, but nobody cares about that.
The PS3 was predicted to be phased out in 2014, because that's how long it was predicted developers would take to figure out how to make it work.
As far as convenience goes: the N64's Expansion Pak (the memory boost add-on) and the Xbox 360's hard drive proved it's VERY easy to make a console both convenient and upgradable. There are four barriers to upgrading a PC:
1. Finding the right component out of the dozens on the market, with each one's differently-advertised features making it hard to tell which is actually better and whether it will even fit your computer's slots. It's a lot easier to upgrade from "Xbox GPU Mk1" to "Xbox GPU Mk2" than it is to go from "Nvidia GeForce Whatever" to "ATI Pro Whatever" or "Radeon HD Infinite Whatever".
2. Price, which can be undercut to secure market share thanks to proprietary control (Micro$oft sells its consoles at a loss and makes it back, and then some, from Xbox Live and its cut of every game sold). Being the sole seller of the component also means they can mass-produce it, package it for individual sale, and be sure of moving every unit.
3. Firmware and driver updates, especially trying to figure out WHICH obsolete PoS in your system actually needs the update, applying it, and hoping the rest of your system can cope with it.
4. The need to actually open the computer case and figure out what goes where and what's compatible with what. Making a hardware upgrade as simple as swapping NES cartridges turns this insurmountable-to-average-Joe barrier into something completely trivial.
In macroeconomics, limiting choice and holding a sales monopoly over a market is a bad thing. In this case, though, under the proprietary model of Xbox upgrades there's no real market - if people want out, they can put up with the hassle of PC gaming instead. Developers still get firm benchmarks, and even some flexibility in what they want to do, because they know that regardless of whether the target console has 4 or 8 GB of RAM, or GPU Mk1 or Mk2, the game still runs as long as every component meets the minimum spec it's rated for.
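To make that last point concrete, here's a rough sketch of how a game could target the guaranteed baseline and only switch on extras when the upgrades are actually installed, the same way N64 games checked for the Expansion Pak. Everything here is hypothetical: the tier names (GPU_MK1/GPU_MK2), the query_hardware() call, and the numbers are made up for illustration, not taken from any real SDK.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical hardware tiers for an upgradeable console. */
typedef enum { GPU_MK1 = 1, GPU_MK2 = 2 } gpu_revision_t;

typedef struct {
    uint32_t       ram_mb; /* e.g. 4096 (base) or 8192 (upgraded) */
    gpu_revision_t gpu;
} hw_profile_t;

/* Stand-in for a firmware call the platform holder would provide. */
static hw_profile_t query_hardware(void) {
    hw_profile_t hw = { .ram_mb = 4096, .gpu = GPU_MK1 }; /* pretend base model */
    return hw;
}

typedef struct {
    bool high_res_textures; /* needs the RAM upgrade */
    bool extra_particles;   /* needs the newer GPU   */
} feature_set_t;

/* The game is built against the minimum spec (4 GB, Mk1);
   anything above that only unlocks optional extras. */
static feature_set_t pick_features(hw_profile_t hw) {
    feature_set_t f = { false, false };
    if (hw.ram_mb >= 8192)  f.high_res_textures = true;
    if (hw.gpu >= GPU_MK2)  f.extra_particles = true;
    return f;
}

int main(void) {
    hw_profile_t  hw = query_hardware();
    feature_set_t f  = pick_features(hw);
    printf("hi-res textures: %s, extra particles: %s\n",
           f.high_res_textures ? "on" : "off",
           f.extra_particles   ? "on" : "off");
    return 0;
}
```

The point of the sketch is just that with a handful of fixed tiers the check is trivial, whereas on PC the same decision has to survive thousands of driver/GPU/RAM combinations.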