Abnaxis said:
I suppose we just need to chalk this one up to differing experiences then. From my perspective in the US, I noticed probably 4-6 consoles for every "home computer" I ever saw, and the computers I had experience with were way more fiddly. Also, not one of them could connect to a television.
You never saw any home computers that could connect to a TV set? That does seem unusual, as that was typical of computers designed for the home market, as dedicated computer monitors were quite expensive. Most models outside of those designed for business were able to connect to TVs.
Abnaxis said:
Aardvaarkman said:
P.S:
Abnaxis said:
All the issues (compatibility, useability, exclusivity, adaptability) were about a million times more important in the day than they are now, from a consumer standpoint.
I'd strongly disagree with this one. At that time, there was no "standard" operating system or hardware for home computers. There was a multitude of competing proprietary systems - far more than there are today. And "usability" wasn't something that was taken for granted. People expected that they had to learn how to use something like a computer. People even expected to have to build their own hardware, or learn to program them.
It was like the days when a car owner would expect that they would have to learn how to do mechanical maintenance in order to be a driver.
I'm confused, because you say you disagree, then reiterate my point. Plug and Play standards were a gleam in Bill Gates' eye.
Now, I'm really confused, because plenty of home computers had plug-and-play hardware well before Gates and Microsoft had any influence on the industry. The IBM PC and the hardware that Microsoft supported were years behind the rest of the industry when it came to anything "Plug'n'Play." In fact, it was often the proprietary systems that enabled easy connectivity (of course, Microsoft's software is also proprietary, but it has come to be treated as something of a de facto standard).
From memory, Plug'n'Play was Microsoft's effort to catch up, and the name was incredibly ironic, because it failed at least as often as it worked.
So no, compatibility was not such a huge issue back then, because nobody expected different brands of computers to be compatible with one another. And usability wasn't such a huge issue, as most electronics had pretty serious usability issues - whether console or home computer. That is not the case in today's age of the easy-to-use smartphone.
Abnaxis said:
Getting a mouse to work in a game usually required an hour of fiddling and cussing.
Maybe on Microsoft or MS-DOS systems it did, but on other systems it was usually a lot easier.
Abnaxis said:
When faced with the choice between swapping out floppies on a beige box that may-or-may-not work with your non-standard hardware after hours of fiddling, or slapping the cartridge into the Atari and playing on your couch with a joystick within minutes, most people went for the Atari. Compatibility, useability, exclusivity, and adaptability all give consoles an advantage over PCs now, but the advantage was WAY BIGGER before.
No, it wasn't. Before MS-DOS and Windows took over the PC market, that stuff was a lot easier. In fact, you could plug your Atari joystick into most home computers and it would just work without any special configuration. And you could just slap a game cartridge into most of them, and get a better game than you would with the Atari.
The usability advantage of the console really wasn't that big, as consoles at the time also tended to have plenty of their own problems with flaky hardware and troubleshooting. But the capability advantage of the home computer was massive compared to the console. It opened up whole new worlds of possibilities. Meanwhile, today's smartphones and consoles can do most of the things that dedicated computers can do, such as browsing the web and watching streaming video - so there isn't such a big advantage to owning a computer.
Abnaxis said:
Also, horse puckey to "every owner was expected to have to learn." That was only true because the only adopters were enthusiasts.
Absolutely untrue. Plenty of people ended up learning computing who were not already computer enthusiasts.
It was completely normal for families who were not nerds, geeks, or enthusiasts in any way to buy home computers. These were marketed to ordinary families. Just look at the sales figures of home computers from the 80s. They are way too big to only represent enthusiasts. If they were only for enthusiasts, they would have only been for sale at specialist computer stores. But they weren't. You could buy them at the kind of mainstream department store where you would buy kitchen appliances.
Abnaxis said:
Again, the computer gaming market certainly existed back then, but I really don't think we can lay the blame for the Atari crash on "people just bought C64s instead."
But you also can't just lay the blame on "quality control" or any other single factor. It was a multitude of factors, including the economy and the home computing market. I'm not sure why it needs to be simplified to a single factor, when there was a whole bunch of stuff going on.