Victim of Technocide

Bete_noir

New member
Apr 6, 2009
10
0
0
The console is simply too simple. It's too damn efficient. You don't have to worry about this, that, and the other thing. RAM, or cards, you just pop the disc in and play that sumbitch till it red-rings or finishes your grilled chicken (in the case of the PS3).

Now that the gaps between consoles and PCs are getting smaller, the problem is that the PC is too complex a monster for a world that is largely not computer-fluent. I know I can't put a PC together. I've got one, but my sixty-two-year-old dad knows more about doing it than I do.
 

dnadns

Divine Ronin
Jan 20, 2009
127
0
0
Shamus Young said:
The sad thing is, I don't see how this could have been averted. What was NVIDIA going to do, not sell graphics cards? Should gamers have not bought them? Should developers have just ignored the advanced hardware? Everyone acted rationally. Everyone moved forward because they wanted the PC to prosper. And yet, the changes they brought about ended up building a wall around PC gaming.
I'd say it's the same problem that showed up in almost every other area, too.
Companies went on to offer highly proprietary APIs, chipsets and drivers to differentiate themselves from other vendors. That was exactly the point when you had to start checking whether a game supported the ATI or the NVIDIA stuff.

If they had kept to standards (e.g. OpenGL) and tried to develop those, things would have gone a lot better for PC gamers.

Now I still use my PC for gaming, but it's only for the old stuff that I missed (or miss again, like Fallout). The rest happens on the PS3 and I don't intend to change that in the near future.
 

WhiteTigerShiro

New member
Sep 26, 2008
2,366
0
0
To put it in terms of an analogy, the life of the PC is like an RPG. Toward the beginning of the game, most characters can do most things with passable efficiency. Warriors can hurl spells for decent damage, a Mage can take a number of hits before finally dying, and an Archer deals just as much damage in melee as he does with his bow and arrow.

That was roughly 10 years ago for the PC, and now the PC is at the higher levels of said RPG where only the Warrior who's put points into Strength and Endurance can take hits without dying quickly, the Mage is only useful if he's casting a spell, and the Archer would get laughed at for even considering unsheathing his sword. You can build some powerful computers that can do some impressive things, but if you haven't invested in your magic stats, then you better not be casting Fireballs.
 

KDR_11k

New member
Feb 10, 2009
1,013
0
0
I guess the influence of the PC depends on where you live; in Germany it still gets the most shelf space of any system.

That modern low-end PCs can't run games at all is more the fault of the games themselves, which require shaders and other features that integrated chipsets often don't have. Besides, those machines can still run Flash games on the internet, which are a pretty massive market right now.
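KDR_11k's point about shaders is essentially a capability check: a game (or its launcher) asks the driver what the chipset supports before deciding whether it can run at all. As a toy Python sketch, assuming the extension string has already been fetched from the driver (the ARB extension names are real OpenGL extensions, but the sample strings are invented):

```python
# Toy sketch: deciding whether a machine can run a shader-based game
# by inspecting its reported OpenGL extension string.

REQUIRED_EXTENSIONS = {
    "GL_ARB_vertex_shader",
    "GL_ARB_fragment_shader",
}

def supports_shaders(extension_string: str) -> bool:
    """Return True if every required shader extension is advertised."""
    advertised = set(extension_string.split())
    return REQUIRED_EXTENSIONS <= advertised

# A discrete GPU typically advertises both shader extensions...
gaming_rig = "GL_ARB_vertex_shader GL_ARB_fragment_shader GL_ARB_multitexture"
# ...while an older integrated chipset may offer neither.
office_box = "GL_ARB_multitexture GL_EXT_texture3D"

print(supports_shaders(gaming_rig))   # True
print(supports_shaders(office_box))   # False
```

When the check fails, the integrated-chipset machine simply isn't in the market for that game, which is exactly the wall being described.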
 

man-man

Senior Member
Jan 21, 2008
163
0
21
Cheeze_Pavilion said:
if people needed the latest GPU in order to Twitter, PC gaming would look a lot different.
Sweet Zombie Jesus... Windows Vista was an attempt to save PC gaming.
 

Threesan

New member
Mar 4, 2009
142
0
0
ratix2 said:
threesan: [...] however, consoles are FAR from replacing pc's. using my ps3 as an example here. the internet browser the ps3 has is barely adequate. [...] and even then using the controller to use the browser isn't very friendly, especially when typing. and it's like this for EVERY console browser, from the psp, to the dsi and wii.
You describe a software deficiency, which, while real and blocking console evolution, isn't the same as what the hardware is fundamentally capable of. Maybe the Xbox 1080 will sport significant in-box storage (assuming blazing-fast internets doesn't replace it, as _some_ are predicting) and run Windows 8. Then maybe you've got a general-purpose, uniformly mass-produced computer in a sealed box--one you could offer at the usual console price reduction (fueled by licensed development and lowered dev, and maybe manufacturing, costs). Maybe consumers with le money and le know-how would still prefer to assemble a system à la carte, but developers could face massive market pressure to migrate towards the few big-market console-computers, with ports into the inhomogeneous olde-computer market being unrealistic.

I suppose I'm raving now, but maybe one of these big-market console-computers would follow more in the Linux line, and up the price of the box (at the cost of market share?) in favor of license-free development...

(Addendum. Two assumptions I'll make explicit: 1) driver-model unification can help, but will never achieve a complete abstraction of the underlying hardware. And 2) operating systems aren't cheap, but you'd need one regardless of your choice of olde-computer versus console-computer, so the price is safe to ignore. Actually, that might be a dangerous assumption. A significant portion of the market may not want or need an OS in their console.)
 

TWEWER

New member
Feb 8, 2009
121
0
0
Really? People are still bitching about PC games being dead? Sure, they were dying about 2 or 3 years ago, but now, thanks to Steam, they're going strong again.
 

Dorian Cornelius Jasper

Space Robot From Outer Space
Apr 8, 2008
396
0
0
TWEWER said:
Really? People are still bitching about PC games being dead? Sure, they were dying about 2 or 3 years ago, but now, thanks to Steam, they're going strong again.
But the market dominance isn't as strong as it once was, and many developers see the PC as ultimately secondary to consoles and develop accordingly. Hence the "glory days gone by" feeling.

Valve happens to be an exception, not the rule.
 

thermo1

New member
Dec 10, 2008
96
0
0
Possibly unseen angle: one advantage/disadvantage (depending on your point of view) of the PC over the console is that pirated games are more readily available from torrents etc. This variable, which doesn't affect the console industry, arguably creates a loss of revenue for PC game developers, who have less and less incentive to create games for the PC, especially when, as others above have said, the average PC owner can't play most of the new-gen games. Pirating isn't really a problem for console developers.

As a PC gamer myself, I find myself looking further and further into the past, pre-2006, for good games I may have missed out on playing and that I know will run on my PC nowadays.


However, I do find Steam very helpful... at making me spend money.
 

51gunner

New member
Jun 12, 2008
583
0
0
Bete_noir said:
The console is simply too simple. It's too damn efficient. You don't have to worry about this, that, and the other thing. RAM, or cards, you just pop the disc in and play that sumbitch till it red-rings or finishes your grilled chicken (in the case of the PS3).

Now that the gaps between consoles and PCs are getting smaller, the problem is that the PC is too complex a monster for a world that is largely not computer-fluent. I know I can't put a PC together. I've got one, but my sixty-two-year-old dad knows more about doing it than I do.
That's a pretty good point. To hook up an Xbox, you don't need to know a thing about what's inside it. When you buy a console, you know that provided the name on the console matches the name on the game, you're in business. Plug the power cable in, plug the AV cable in, good to go. I imagine it also makes games a little easier to develop, because you know the exact system you're going to be working with and what specifications to cater to. PC development, I imagine, is a crapshoot of deciding where to place the system-requirements bar.
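The "where do we place the bar" crapshoot can be made concrete with a toy calculation: given a survey of what players actually own, see what share of the market a candidate minimum spec would keep. All numbers here are invented for illustration:

```python
# Toy sketch: picking a minimum spec from (invented) player hardware data.

def coverage(player_ram_mb, min_ram_mb):
    """Fraction of surveyed machines meeting a candidate RAM minimum."""
    return sum(r >= min_ram_mb for r in player_ram_mb) / len(player_ram_mb)

survey = [512, 1024, 1024, 2048, 4096]  # invented survey data, in MB

print(coverage(survey, 1024))  # 0.8 -> a 1 GB minimum keeps 80% of players
print(coverage(survey, 2048))  # 0.4 -> a 2 GB minimum keeps only 40%
```

A console developer never runs this calculation: the "survey" is a single fixed machine, and coverage is always 100%.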

I find Therumancer's counter-argument somewhat amusing: WoW is in part so popular because you can play it with a very plain machine. You don't need a fancy gaming computer to handle the ol' World of Warcraft.
 

Halbyrd

New member
Feb 17, 2009
11
0
0
man-man said:
Sweet Zombie Jesus... Windows Vista was an attempt to save PC gaming.
Guess Bill doesn't look so silly now, does he?

dnadns said:
I'd say it's the same problem that showed up in almost every other area, too.
Companies went on to offer highly proprietary APIs, chipsets and drivers to differentiate themselves from other vendors. That was exactly the point when you had to start checking whether a game supported the ATI or the NVIDIA stuff.

If they had kept to standards (e.g. OpenGL) and tried to develop those, things would have gone a lot better for PC gamers.
Erm...I'm not sure what rock you've been living under, but we haven't seen games that *require* ATI or *require* Nvidia specifically to run for years now. This is all thanks to that magical little thing called DirectX that Microsoft spent a lot of time flogging a while back, if you'll recall. Sure, there have been "plays best on..." labels slapped on games for a while, but that's mostly marketing hype. You might have a bit of a point if you're talking about Linux gaming, as ATI cards have basically not worked on Linux for years, but let's be honest, the state of Linux gaming has been in sorry shape from day one, and we can hardly lay all the blame on the manufacturers for that one.

Me, I place the blame for the segmentation of the market on Intel, and their awful, awful, AWFUL integrated graphics chips. They were selling their chips cheaper than anybody else, and bargain PC makers bought them by the boatload. Thus, the baseline system wasn't keeping pace with what game devs needed to produce a decent-looking game.
 

dnadns

Divine Ronin
Jan 20, 2009
127
0
0
Halbyrd said:
dnadns said:
I'd say it's the same problem that showed up in almost every other area, too.
Companies went on to offer highly proprietary APIs, chipsets and drivers to differentiate themselves from other vendors. That was exactly the point when you had to start checking whether a game supported the ATI or the NVIDIA stuff.

If they had kept to standards (e.g. OpenGL) and tried to develop those, things would have gone a lot better for PC gamers.
Erm...I'm not sure what rock you've been living under, but we haven't seen games that *require* ATI or *require* Nvidia specifically to run for years now. This is all thanks to that magical little thing called DirectX that Microsoft spent a lot of time flogging a while back, if you'll recall. Sure, there have been "plays best on..." labels slapped on games for a while, but that's mostly marketing hype. You might have a bit of a point if you're talking about Linux gaming, as ATI cards have basically not worked on Linux for years, but let's be honest, the state of Linux gaming has been in sorry shape from day one, and we can hardly lay all the blame on the manufacturers for that one.

Me, I place the blame for the segmentation of the market on Intel, and their awful, awful, AWFUL integrated graphics chips. They were selling their chips cheaper than anybody else, and bargain PC makers bought them by the boatload. Thus, the baseline system wasn't keeping pace with what game devs needed to produce a decent-looking game.
Well...

uname -a
Linux sardaukarmobile 2.6.28-11-generic #42-Ubuntu SMP Fri Apr 17 01:58:03 UTC 2009 x86_64 GNU/Linux

;-) (but that doesn't really say anything about the gaming side, as there is no such thing on Linux)

I do know that driver issues and DirectX support have changed over time, but the issues back then were enough to keep me away from hardware-demanding PC gaming even now. But I was actually referring to the time Shamus mentioned, not the here and now.

The problem described in the article is that segmentation already took place somewhere in the past. I was merely trying to point out that all was well as long as there was basically just one thing to look at: the number after "Voodoo".
Others tried to offer their own solutions, too, but were not really successful until ATI got into the ring. (I still remember owning a Matrox M3D accelerator card which was supposed to be the best of the... whatever. Sadly, the only big supported title was Tomb Raider 1, and I can't even recall another game for it.)

Even if DirectX offers a more abstract layer for programmers, it wasn't all just marketing back then, and developers didn't seem to have the nerve to support several different routines to achieve the same level of eye candy.

I am not that much into the whole thing anymore, but I am sure there are games which need a certain chip to run on maximum settings. I would be really surprised if NVIDIA, ATI & co. had created hardware intelligent enough to make chip-specific shader programming with proprietary commands obsolete. So either developers now program the same routine for different chipsets, or the card manufacturers went back to competing only via GPU speed instead of proprietary extra algorithms.
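That "same routine for different chipsets" situation amounts to vendor dispatch: the engine looks at what hardware it's on and picks one of several implementations of the same effect. A toy Python sketch of the idea; the routine names are purely hypothetical, and real code would branch on something like the string returned by OpenGL's glGetString(GL_VENDOR):

```python
# Toy sketch of per-vendor code paths: one visual effect, several
# implementations, selected by the reported GPU vendor string.

def pick_water_renderer(vendor: str) -> str:
    """Choose a (hypothetical) water-effect routine for a GPU vendor."""
    vendor = vendor.lower()
    if "nvidia" in vendor:
        return "water_nv_path"        # vendor-specific NVIDIA routine
    if "ati" in vendor:
        return "water_ati_path"       # vendor-specific ATI routine
    return "water_fixed_function"     # lowest-common-denominator fallback

print(pick_water_renderer("NVIDIA Corporation"))  # water_nv_path
print(pick_water_renderer("ATI Technologies"))    # water_ati_path
print(pick_water_renderer("Intel"))               # water_fixed_function
```

Every branch here is code somebody has to write and test, which is exactly the cost developers apparently didn't have the nerve for.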


But maybe the whole thing comes from the wrong perspective altogether, and PC gaming declined with the rise of HDTV resolutions and online gaming for consoles.
 

onelifecrisis

New member
Mar 1, 2009
165
0
0
Good article, but I'm not sure I completely agree with it. It seems you're saying that the PC "killed" itself, rather than being "killed" by consoles, and that's the bit I don't agree with.

I'm a PC gamer through and through, and I doubt I'll ever switch to consoles (at least not until they start shipping with mice), but even I have to admit that consoles are cheaper, more accessible, and more robust and stable than PCs. It's also easier to develop games for consoles, and much easier to QA them. I personally don't much like console controllers, but I can see how many people, especially kids, would prefer them to a keyboard and mouse, particularly for certain types of games (fighting games and modern variations on "the platformer" spring most readily to mind). And IMO none of these things loses any of its significance when I factor in the fact that PCs have become divided into "normal" and "gaming" rigs since the invention of the GPU. I'm not saying the invention of the GPU played no part, but I really don't think you can point to it as the primary cause (let alone the sole cause) of the PC's downfall.

Some more (somewhat related) comments, if I may:
I said I'd never switch to consoles until they start shipping with mice, but that's an oversimplification. Although I'm not keen on (current) console controllers, my real problem with consoles is actually the games themselves. I think I'm not alone in this, given the number of PC gamers who describe console games (and console ports) as "dumbed down". But consoles have only (relatively) recently risen to the forefront of gaming, and gaming itself has only recently risen to the forefront of the entertainment industry, so now we enter an interesting time: the console gaming generation is growing up, and I wonder whether they will grow out of games or not. If not - meaning, if a significant number want to carry on playing games into adulthood - then we may see a shift in consumer demand towards console games that aren't so "dumbed down", and then it would only be a matter of time before the industry responded to that shift. If and when that happens, I just might be tempted to make the jump to console land.
 

onelifecrisis

New member
Mar 1, 2009
165
0
0
HeartAttackBob said:
While Shamus makes a damn good point, and I largely agree with him, there is still a decent segment of gaming that relies exclusively on computers.
How many millions of players is World of Warcraft up to? Thirteen? Fourteen? And not a console in sight.

Just because our beloved GTA games (and others) tend to treat the PC version with a level of disdain roughly equivalent to what Yahtzee feels for Quicktime events doesn't mean that PC gaming is dying. More like mutating. True, we're likely to get some yellow redneck supermutants and severely overgrown homicidal cockroaches, but we may also get some friendly and humorous (if hideous) ghouls out of the mix.

We're also seeing consoles (particularly the 360 and ps3) move closer and closer to PC level functionality: playing movies, connecting to the internet, I've even heard of people installing Linux on their PS3... although that probably sets off the "Nerd!" alarm installed in most brains. And from several reports, the 360 performs Seppuku with the same high frequency as any modern PC.

True, the entry requirements to mainstream PC gaming are high, arguably even higher than those to console gaming, but we (The PC Gaming Master Race) have to distinguish ourselves from those ancient grandmas and yapping kiddies who spend their time flailing Wiimotes around their living room... Right?
LOL, good post. But I have to point out (sorry) that "Quicktime" (capital Q, one word) is a bit of software made by Apple, and "quick time events" is what you meant to say. I figured that (as a member of The PC Gaming Master Race) you'd want to know. ;)
 

UnSub

New member
Sep 3, 2003
55
0
0
To those who think that WoW / Blizzard is in any way the example that shows PC gaming is a-okay, please take a look at this list of PC game sales for Feb 2009 [http://www.shacknews.com/onearticle.x/57632]. Blizzard has 6 spots: 3 for WoW and 3 split between Warcraft III, StarCraft and Diablo. Yeah, Diablo. It's a good game, but that box was first released in 2003 (according to Amazon). Diablo II launched in 2000. Let's not even talk about how old Diablo itself is. We've got a game that is 9-ish years old yet is still in the top 20 of monthly PC sales.

Then take a look at the other titles. The Sims takes up 5 spots. So 11 out of the top 20 sales for PC games in that month come from two companies and are franchises. Looking at the rest of the titles I don't see much new blood coming in. Left 4 Dead, two Spore titles, that's about it.

Yeah, digital distribution comes into it, but PC developers still want games on shelves because not everyone has access to high-speed internet that will let players download several GBs for a game. It might be growing, but it isn't as important for mainstream game releases as some might want to think.

The PC games market is stagnant because the growth is in only two areas: MMOs and casual games (perhaps browser-based or download-light titles). If a developer is creating something outside of that range, they usually look at consoles or cross-platform options. Piracy on the PC certainly doesn't help them head towards PCs either.
 

shaderkul

New member
Apr 19, 2009
73
0
0
Kojiro ftt said:
I think it could have been avoided if the marketing of graphics cards didn't get so out of hand. In the beginning, it was Voodoo 2, 3, 4, etc as Shamus has mentioned. But then marketing got involved in naming the chipsets. Next thing you know, you can buy a GeForce 3 and it would actually be WORSE than your GeForce 2, because you bought the retarded GS version, or whatever the tag is they came up with that week. That's when the market became unnavigable. You couldn't just say "I need a better card" and find one with a bigger number than the one you already had. You have to research stuff so you don't get hoodwinked by marketing. To GeForce and ATI, I say a big "Fuck You" and good riddance.

Now I just wish consoles and their games would natively support keyboard and mouse input...
I totally agree with this: I blame the graphics card manufacturers for creating such an unholy mess of chipsets and marketing bullshit.
 

onelifecrisis

New member
Mar 1, 2009
165
0
0
scobie said:
A nice article and some good points. Seems to fit pretty well with my own experiences. My computer is most definitely a "regular" computer, and I am for the most part a console gamer because I know my 360 will be able to run the games I want without fuss and they'll look pretty. When I play on my computer it's because I want to play a game that either hasn't been released on a console or simply won't play well without mouse and keyboard, mainly strategy games (unlike a lot of PC gamers I have no problem with console FPSs).

onelifecrisis said:
My real problem with consoles is actually the games themselves. I think I'm not alone in this, given the number of PC gamers who describe console games (and console ports) as "dumbed down". But consoles have only (relatively) recently risen to the forefront of gaming, and gaming itself has only recently risen to the forefront of the entertainment industry, so now we enter an interesting time: the console gaming generation is growing up, and I wonder whether they will grow out of games or not. If not - meaning, if a significant number want to carry on playing games into adulthood - then we may see a shift in consumer demand towards console games that aren't so "dumbed down", and then it would only be a matter of time before the industry responded to that shift. If and when that happens, I just might be tempted to make the jump to console land.
Interesting idea. However, I'd like to make the point that a great many console owners are already adults. Now, for whatever reason, consoles are the platform of choice for the mainstream gamer. This means that by their nature consoles offer games that are simpler and easier to get into (not necessarily a bad thing, it depends on your tastes). I'm not sure how relevant the age demographics of console gamers are. I'd also like to make the point that the complexity of a game's interface and the commands that can be input, and therefore the complexity of the game itself, is necessarily limited by the fact that most console gamers will always play their games with a simple console controller. This might be a factor that does not depend on intentional "dumbing down". There's still no excuse for not releasing Shadow of Chernobyl on consoles, though. Grrr.
I agree that games are limited by their controllers, and perhaps you're right about there being a lot of adult console gamers (anyone got some hard stats on this?) but I hope you're wrong!
 

sgtshock

New member
Feb 11, 2009
1,103
0
0
Interesting article. GPUs probably had at least something to do with the decline in popularity of PC gaming. However, it seems to me that the GPU market hasn't been pushing itself like it used to. A few years ago, I seem to remember thinking "Good God, they have the _000 series out now? Now I'm even more obsolete!" But lately, I haven't heard much on the GPU front. It probably has a lot to do with the current console generation doing so well, with the next generation nowhere in sight.

With games getting more and more expensive to make, I think hardware developers are getting more hesitant to push the bar higher. Hopefully this will give people time to "catch up" on hardware, so to speak, or, even better, encourage developers to make less demanding games, breathing new life into PC gaming.