Five Entertainment Reforms Millennials Should Be Fighting For

maninahat

New member
Nov 8, 2007
4,397
0
0
Number 2 isn't just about exploiting public holidays. It's an economic practice known as price discrimination (not price fixing, which is collusion between competitors): a way to extract the most profit whilst still getting the product to the greatest number of people. So here is the problem: people in first-world nations are richer than people in developing nations, but they all want to watch movies. How do you price a movie so that you can make money off the rich whilst still enabling the poor to see it? The answer is to charge different prices, proportional to how wealthy the average person in each respective country is. The only problem with doing it this way is that you have to come up with a means of stopping people from rich countries exploiting the price difference: going to the poor ones, buying cheaply priced DVDs in bulk, and bringing them back to the wealthy nations to sell at a knocked-down price.

Hence, region locking.

Almost all technology is priced differently from country to country, but most goods aren't conveniently portable between markets. Things like software, movies and music are far easier to transport, so you need a technological means to counteract this.
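For what it's worth, the enforcement check itself is trivial; here's a toy sketch in the spirit of DVD region codes (names invented for illustration - real players enforce this in firmware under the CSS licence, not in application code):

```cpp
// Toy sketch of a region-lock check, in the spirit of DVD region codes.
#include <cstdio>

enum Region {
    R1_NorthAmerica = 1 << 0,
    R2_EuropeJapan  = 1 << 1,
    R4_LatinAmerica = 1 << 3,
};

// Discs carry a bitmask of permitted regions; a region-free disc sets
// every bit. The player refuses anything outside its own region.
bool CanPlay(unsigned discRegionMask, Region playerRegion) {
    return (discRegionMask & playerRegion) != 0;
}

int main() {
    unsigned disc = R4_LatinAmerica;  // a disc priced for a poorer market
    std::printf("%s\n", CanPlay(disc, R1_NorthAmerica) ? "plays" : "refused");
}
```

The hard part was never the check; it's that the check has to survive contact with consumers who own the hardware, which is why it lives in licensed players rather than in software anyone can patch.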
 

ZippyDSMlee

New member
Sep 1, 2007
3,959
0
0
The less BWC there is, and the more region locking, the less I want to buy.


And BWC is very doable as long as it's understood going into development of the newer units. The PS4 and Xbone makers knew beforehand that they were not going to support it, so I will not support them.
 

The Material Sheep

New member
Nov 12, 2009
339
0
0
Agayek said:
DeadMG said:
How are you going to call the functions when you don't know what instruction set to emit calls for? I very much doubt that you can build one set of binary code that will be "call" on x86 and "call" on Cell, with all the same relevant semantics and ABIs.

I'm getting the feeling that what you're really thinking of is using a VM or other JIT compilation. This has far more serious ramifications than just degraded performance (of all kinds). For example, the design of C and C++ effectively prevents them, in many cases, from being compiled to a platform-independent IL. Ask the LLVM guys. The console developers would have to re-engineer their entire toolchains (not to mention potentially their own languages...) to support such a thing. The cost in both performance and developer time, for both console manufacturer and game developer, would be immense.
I'm thinking that they should have designed their consoles so that games don't directly interface with the hardware, and a VM is one way of doing that, yes. I'd personally prefer something closer to the way PCs handle these issues, with a standardized set of outward-facing software hooks (see: OS system calls, DirectX APIs, etc.) so that the amount of direct hardware <-> external software interaction is kept to a minimum. It's really quite simple: software designed for Windows XP on a certain machine can run just fine (albeit with a few rare hiccups, depending on the program in question) on a Windows 7 machine with completely different hardware. That's the kind of separation I'm talking about (sketched below).

Now, I absolutely agree with you that, as the consoles currently stand, backward compatibility is not a feasible option and we can't really expect it to happen. That doesn't mean it wasn't a mistake for the console manufacturers to put themselves in that position in the first place.
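A minimal sketch of the kind of outward-facing interface described above - all names hypothetical, and the real engineering is vastly harder than this suggests. The game targets a stable interface, and each hardware generation supplies its own implementation behind it, the way PC software targets Win32 or DirectX rather than a particular GPU:

```cpp
// Hypothetical console hardware-abstraction layer (HAL) sketch.
#include <cstddef>
#include <cstdint>

// Stable interface the game sees; frozen once published.
class IGraphicsDevice {
public:
    virtual ~IGraphicsDevice() = default;
    virtual void Clear(std::uint32_t rgba) = 0;
    virtual void DrawTriangles(const float* verts, std::size_t count) = 0;
    virtual void Present() = 0;
};

// Each hardware generation ships its own backend behind the interface.
class Gen1Gpu : public IGraphicsDevice {
public:
    void Clear(std::uint32_t) override { /* program gen-1 silicon */ }
    void DrawTriangles(const float*, std::size_t) override {}
    void Present() override {}
};

class Gen2Gpu : public IGraphicsDevice {
public:
    void Clear(std::uint32_t) override { /* program gen-2 silicon */ }
    void DrawTriangles(const float*, std::size_t) override {}
    void Present() override {}
};

// Game code never learns which generation it's running on, so a
// gen-1 title keeps working when a Gen2Gpu is slotted in behind it.
void RenderFrame(IGraphicsDevice& gpu) {
    gpu.Clear(0x000000FFu);
    gpu.Present();
}
```

Whether the per-call overhead of an interface like this is "fairly small" on console-class hardware is exactly the sticking point argued over later in the thread.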
Hindsight is 20/20, I suppose. I think people often look at business decisions years after the fact and discern motive from the outcome alone. It could have just been a bad call, or the market and technology progressed in a way that wasn't expected. I suppose what I'm saying is that this isn't malicious - or at least it doesn't seem to be. It seems like the product of a bad call that will take time to correct while remaining economical.

People like Bob, and the person who wrote the article Bob based this one on, often ask for things as if merely enough people wanting them will change how things work. Sometimes, in pure social politics and service industries, that's enough; but in economics and technology there is always a cost to implementing these things. The movie industry CAN move to releasing all formats at once, but whom would it cost, and what would it cost to change? Perhaps it's entirely beneficial, but it's incredibly naive to think anything economic has a cut-and-dried solution like this.
 

Agayek

Ravenous Gormandizer
Oct 23, 2008
5,178
0
0
th3dark3rsh33p said:
Hindsight is 20/20, I suppose. I think people often look at business decisions years after the fact and discern motive from the outcome alone. It could have just been a bad call, or the market and technology progressed in a way that wasn't expected. I suppose what I'm saying is that this isn't malicious - or at least it doesn't seem to be. It seems like the product of a bad call that will take time to correct while remaining economical.
It wasn't malicious at all, just kinda dumb. I can't speak to the whys or hows of it, but the lack of backward compatibility support in recent consoles shows a clear lack of foresight. They either didn't plan for subsequent consoles and so never implemented the structure needed for proper BWC (and this I could buy with the Xbox and Xbox 360 compatibility, though the Xbone has no excuse) or they deliberately chose not to for whatever reason.

It's a simple enough problem to solve going forward, but considering the manufacturers' apparent stance on the issue, I wouldn't bet on them having implemented it in the now-current gen either.
 

The Material Sheep

New member
Nov 12, 2009
339
0
0
Agayek said:
th3dark3rsh33p said:
Hindsight is 20/20, I suppose. I think people often look at business decisions years after the fact and discern motive from the outcome alone. It could have just been a bad call, or the market and technology progressed in a way that wasn't expected. I suppose what I'm saying is that this isn't malicious - or at least it doesn't seem to be. It seems like the product of a bad call that will take time to correct while remaining economical.
It wasn't malicious at all, just kinda dumb. I can't speak to the whys or hows of it, but the lack of backward compatibility support in recent consoles shows a clear lack of foresight. They either didn't plan for subsequent consoles and so never implemented the structure needed for proper BWC (and this I could buy with the Xbox and Xbox 360 compatibility, though the Xbone has no excuse) or they deliberately chose not to for whatever reason.

It's a simple enough problem to solve going forward, but considering the manufacturers' apparent stance on the issue, I wouldn't bet on them having implemented it in the now-current gen either.
Going forward it certainly is simple, because more is known about the direction programming and technology are heading. Looking back, however, the different choices in programming and system architecture could be merely a misjudgment about which direction things were going to go. Sure, they could have kept everything designed around one method, but what happens if something comes out immediately afterwards that trivializes or wrecks the format you've chosen for ALL your systems? What if there are unforeseen limitations in the format you've settled on? In the business of technology, binding yourself at the hip to a certain piece of tech or way of programming is an easy way to get run out of the market.

So, at the end of it: I'm not saying the people in question ultimately made the correct decision, just that given what was known at the time, it would have been a HUGE gamble to plan that far ahead hoping to come out on top. It'd be a kick-ass feature on a console if they had, and easily a huge selling point, but that's using today's information to judge a decision that would have had to be made years ago.
 

SL33TBL1ND

Elite Member
Nov 9, 2008
6,467
0
41
Bob, don't say "millennials" again. Ever. It's almost as shittily condescending as "young people".
 

Kinitawowi

New member
Nov 21, 2012
575
0
0
It's reassuring to think that all of these ills are caused by something as trivial as Big Corporations Being Bastards. It's an appealingly lefty idea: that the only reason all these apparently simplistic concepts aren't having their apparently simplistic solutions applied yesterday is that The Man is holding them hostage.

Oh, if only things were that simple.

1) and 2) are the same problem: as much as we like to think it is, the world is not quite yet one global economy. Certain developed parts of it are, more or less, but as 5) will show us, outside the enriched and developed West are an awful lot of people who still need to be able to receive and process media but don't have the resources to pay Western prices. Saying "well, fuck them for being poor" isn't an option; you have to tailor your prices to your market - and your content for that matter, given that certain countries have their own censorship laws - and bam, regional encoding invents itself.

Yes, for sure, it sucks. The solution? Redevelop Africa into an economic paradise and topple all the tinpot dictators. Annex China and eliminate their censorship laws. Let the Middle East get on with nuking itself out of existence, and give Brazil enough money to buy out the rest of South America. Simples!

3)
: The fact that most (about 90-95%) Blu-ray players will run DVDs from 1997 with no problem but no "next gen" game console will play even its direct predecessor-system's games is asinine
I hate to use this sentence, and I'm sure I'll get a warning for it, but here goes: You have absolutely no idea what the hell you are talking about, Bob.

People need to stop propagating the lie that backwards compatibility is easy. Yes, the PC has managed it (with a lot of fudging en route) up until now, but the only reason for that is that Microsoft has held a dominant position in the OS market for so long that they've been able to steer its growth. I have software that ran fine on Windows 3.1, was buggy by XP and doesn't run at all on 7 - and emulation is not a perfect solution.

Oh yes, emulation! The word of the last fifteen years. It was always a thing in technology, but first bleem! and then MAME shoved it front and centre. What a concept! That people could make decades-old technology, with circuit boards and chips and everything, run on your modern PC! And these weren't even professional people; most of them were hobbyists! Nicola Salmoria can't have had any idea what his university thesis pet project would unleash on the world (other than the ability to play Pac-Man), but the MAME FAQ set out several of the issues...
Fix the sound in Asteroids? Not doable unless you intimately understand discrete circuitry as well as the inner workings of a $5000 piece of modeling software that you'd better have a copy of lying around.

Add the voice to Wizard of Wor? Not unless you can emulate a chip that's been out of production for 20 years, for which the only known documentation is an old, threadbare data sheet written in Japanese and cribbed from a former Sanyo employee.

Fix the colors in Pig Newton? Not unless you can track down one of the fewer than 10 remaining existing boards and dump the color PROM from it.

Fix the graphics glitches in Mole Attack? Well, the game is interacting with some protection device that's totally unknown because we have no boards, no schematics, nothing. What values is the device looking for? Golly, I wish we knew!

Fix the cocktail mode in game X, Y, or Z? Hmmm...let's play blindman's bluff with the DIP switches and go hunting for the proper background and sprite registers. Is that it? Noooo... How about that one? Noooo.... Hey, two down, only thirty more to go for this particular DIP-switch setting...!

A recommendation: If you want a bug in your favorite game to be fixed, report it at MAME Testers and try to characterize it as specifically and with as much relevant detail as you can. The devs will get to it eventually, really they will. If they don't get there fast enough for you, well, you'll just have to learn C and wade in there yourself. Then maybe you'll find out what a pain in the neck maintenance and repair can be.

The cry goes up, "But I'm not a programmer!"

Sorry, but if you aren't willing to wait and yet aren't willing to learn, then stuff a sock in it.
You can bet your arse that if the 360 were emulated and there were so much as one of these issues, the peanut gallery would throw a fit. And they'd be right to, because they'd have paid for a console with a feature riddled with bugs. And that's before you start on the processing issues...
M14. Why is MAME so slow? These games ran at less than 10 MHz, and my CPU is 500 MHz!

You are comparing the following objects. [The FAQ illustrates the point here with images of two completely unlike objects, omitted.]

In emulation world, megahertz is not analogous from your main CPU to the emulated CPU. MAME not only rigorously emulates every opcode of the emulated CPU(s), but also memory interfaces, video output and sound emulation, and all this in portable C code. See also the next question.


M15. Is MAME a simulator or an emulator?

That depends entirely on the definition of those words. In electrical engineering, the word "emulation" has traditionally been used to mean a very low-level reproduction of real life electrical signals. For example, professional microprocessor emulator software comes with a processor-shaped connection, which you can actually plug into a motherboard and run instructions with it.
MAME runs simulated CPU instructions on top of simulated memory maps and I/O spaces. If simulation had to be defined, there would be three levels:

Signal level (simulating in/out of actual pins of ICs). This would be necessary to allow older games like PONG into MAME, because the association of ICs on the board IS the game logic, not just following a set of instructions from a ROM.
Logical level (simulating fetch/execute CPU cycles on simulated memory/io addresses). All games in MAME currently run simulations at this level.
HLE level (combination of logical level with some "acceleration" added). Implemented in some other emulators, they attempt to skip certain areas of the simulation by dealing with the code in a manner alien to the original hardware. Generally complicated to implement, and very specifically tied to a particular game code.
Most people make the simulation/emulation cut based on a couple of factors, including if you can support all the same games the original hardware did without game-specific hacks. MAME's CPU and sound cores pass that test literally every day as new games are added. Some other emulators that rely on a HLE approach fail it badly. A descriptive comment about the detail level of MAME's drivers is "if someone can make an FPGA version of the game, the driver documents well enough", and that's actually happened for Pacman using MAME as a reference.
In other words, MAME is against simulating games, it's not against simulating components. The only way you can emulate a game is to simulate all the components. All those chips weren't really created in C.
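To make the megahertz point concrete, here is a toy fetch-decode-execute loop - not MAME code, and an invented instruction set, purely to illustrate the technique. Every emulated instruction costs the host a fetch, a decode, a dispatch, and cycle bookkeeping, before you even start on video, sound, and memory-mapped hardware:

```cpp
// Toy interpreter illustrating why emulated MHz != host MHz.
#include <cstdint>
#include <vector>

struct ToyCpu {
    uint16_t pc = 0;                 // program counter
    uint8_t  a  = 0;                 // accumulator
    uint64_t cycles = 0;
    std::vector<uint8_t> mem = std::vector<uint8_t>(65536, 0);

    uint8_t fetch() { return mem[pc++]; }   // every guest byte = host work

    void step() {
        switch (fetch()) {                        // decode + dispatch
        case 0x01: a = fetch(); cycles += 2; break;         // LDA #imm
        case 0x02: mem[fetch()] += a; cycles += 3; break;   // ADD mem, A
        default:   cycles += 1; break;                      // treat as NOP
        }
        // A real emulator also models flags, interrupts, memory maps and
        // the video/sound chips on every step - dozens of host
        // instructions per emulated one, all in portable C.
    }
};

int main() {
    ToyCpu cpu;
    while (cpu.cycles < 1000) cpu.step();     // run ~1000 guest cycles
}
```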
Right now, the only way to get a "prev-gen" console emulated successfully would be an HLE-level implementation that would go badly wrong sooner than you think, and that, based on current processing power, would prove unacceptably slow. The only other option is to include the 360 and PS3 processors (and their cooling, and so on) in the new machines, at ridiculous expense.

Look, backwards compatibility is a nice idea. It provides an expansive library in those vital first few months of a new console's life, and it helps clear sockets from the back of the TV. But it's never been a given, even on systems which apparently should have had it by default, or even apparently did have it. Way back in the 80s, some software written for the Sinclair ZX Spectrum couldn't run on the later, bigger, better +2A edition because of the most minor of modifications to the ROM (a change in the copyright notice to reflect the new year and model!) - code on the ZX was often written that tightly, to the point of depending on exact ROM contents, and some companies have made similar claims about squeezing that much out of the PS3 [http://www.escapistmagazine.com/news/view/121896-The-Last-of-Us-Squeezes-Every-Last-Drop-of-Power-From-PS3]. Backwards compatibility was a nice gift while we had it (which, looking back, only really meant PS1->PS2 and GC->Wii; PS2->PS3 was phased out due to technical issues and Xbox->360 never really worked), and the change to x86 processors will make it easier in the future, but it should never have been considered an automatic given, and dismissing the problems as "asinine" does a whole lot of people a disservice.

The other fallacy is treating software in the same way as other media. While a film can be treated as a video stream and an audio stream, a CD as just an audio stream, and a book as just a text stream, a game is a slew of code, written for a specific architecture, that needs to be acted upon and interacted with on the fly. It's also a slew of other assets, potentially including video and audio streams, art models and textures, which need to interact with that code in specific ways. It's also some form of external input (unless it's Dear Esther, which may as well not bother), which also needs to interact with that code. The aforementioned streams only need to be reconfigured for output and they'll work on the next device. Reconfiguring for action, interaction and input? Not even in the same ballpark.

4) It took me a while to realise that Guitar Hero II selecting all its song section names by reference to a predefined list, rather than including them directly in the song file, wasn't just a space-saving measure (as it used to be back on the Spectrum I mentioned earlier; Knight Tyme, for instance, has pretty much every dialogue substring of more than three characters addressed by a pointer) but a way to manage language options. A quick sketch of the mechanism follows.
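The mechanism is simple enough to sketch (hypothetical names throughout; the data file stores only a small index, and the engine resolves it against whichever language table is active):

```cpp
// Sketch of an indexed string table doubling as a localisation hook:
// data files store an ID, never the text itself.
#include <array>
#include <cstddef>
#include <cstdio>

enum class Lang { English, French };
enum SectionId { kIntro = 0, kVerse1 = 1, kBigRockEnding = 2 };

const std::array<const char*, 3> kEnglish = {
    "Intro", "Verse 1", "Big Rock Ending"};
const std::array<const char*, 3> kFrench = {
    "Intro", "Couplet 1", "Grand final rock"};

// The song file only stored the small ID; the display string is
// looked up at runtime against the active language's table.
const char* SectionName(SectionId id, Lang lang) {
    const auto& table = (lang == Lang::French) ? kFrench : kEnglish;
    return table[static_cast<std::size_t>(id)];
}

int main() {
    std::printf("%s\n", SectionName(kBigRockEnding, Lang::French));
}
```

Swapping the table swaps the language without touching a single song file, which is exactly why it doubles as both a space saver and a localisation hook.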

But again, this stuff ain't simple. Retranslating, adding audio, and so on doesn't come cheap; it has to be done by competent staff and properly QA'd, otherwise we'll be stuck again with miracles that never happen and guys that are sick. It's the perennial "translation versus localisation" problem, and it's not quite as simple as you make it sound.

As for your 3D glasses idea to shield off part of the image... kindly take your 3D glasses and insert them about your person. As somebody who wears glasses in order to see, and could therefore be considered partially sighted or disabled depending on your metric, I remain violently opposed to 3D glasses as a concept, because in order to use them I have to wear them over my existing glasses, and sorry bro, but that shit hurts. 3D glasses are a solution to a problem that never existed (was anybody really crying out for deeper immersion in film and TV? I can't even see the effect anyway), and hell yeah, that's interfering with the experience.

5) I said it in my response to those two (fantastic) Critical Intel pieces at the time, and a few more people have said it here: the problem is the countries themselves, not the cheap-ass freeloading bastards who like making scads of money off devices they can produce cheaply by robbing the hell out of African shitholes. The DRC should be the wealthiest country in the world; it plays host to substantial deposits of the minerals that supply the West's current obsession with gadgetry, and it's not the multinationals that exploit it into being one of the poorest. What's needed is redevelopment of the whole continent. Highways, as CJ Cregg so eloquently told us in The West Wing all those years ago. Advancement at every level: healthcare, public order, sewerage, food production, markets. Access to education, art, media - even (especially) for the poorest, but see 2) above. Removal of all the dictators who treat the local army as their personal militia. Region-wide regime change.

Yeah, 'cause that worked out so well for us last time.


All these things need to change, for sure. But throwing around phrases like "asinine" and "the only reason" reduces intractable problems to apparently simple solutions, and then wrings hands and moans at execs when those solutions aren't forthcoming. I'd love to believe it's all as easy as everyone else does, but I'm a cynic. Steps are being taken (the x86 thing, which is going to cause other issues soon enough; the companies who are doing what they can about conflict minerals, even if Nintendo are remaining firmly on their own duffs; etc.), but steps are all they are, and every step has its own issues. It's right - and encouraging - that the steps are being taken and that the issues are kept in mind, but it's wrong - and damaging - to treat them as simplistic and immediately solvable.
 

crc32

New member
Jun 20, 2012
2
0
0
Agayek said:
Point of order:

Sustained backwards compatibility from now until the end of time very much is possible. It simply requires that the manufacturers plan for it. What they would need to do is design a standardized interface that stands between the game and the hardware (think DirectX in Windows machines, or Android as a whole), and keep that standard throughout every iteration of the console. Then, all they need to do to maintain backwards compatibility is create the console-facing side of the interface with each iteration of the hardware.
Computing technology is not remotely predictable; a manufacturer cannot predict trends 10 years out, much less from here to eternity. And DirectX isn't the best case study for backwards compatibility, considering how many things break from iteration to iteration.

Agayek said:
They didn't do this for several reasons, not least of which is that console manufacturers don't want backwards compatibility (they can't charge you full price for "HD re-releases" that way). There's also technical reasons, because interfaces like that do have an impact on performance. If they design it right, the impact is fairly small, but they're so obsessed with forcing out the latest shiny to sell games based entirely on insubstantial flash that it can seem much larger than acceptable.
How exactly would you design this interface? What exactly is "fairly small"? Even something as low-level as the order in which C++ functions are compiled and laid out can affect performance significantly enough to shape the design of the code.

As much as people like to think performance is only for insubstantial flash, beefier hardware also makes development more forgiving. If you've had to develop for the iPad 1 as recently as this year, you'll know that having to run perf tests wastes hours, which runs up costs dramatically.

Agayek said:
That said, the lion's share of the blame lies in the fact that they didn't think that far ahead. If they had sat down and properly planned out the console's, and its successors', lifecycle(s), this kind of thing would (or at least should) have easily occurred to them. It's basic software engineering, literally sophomore year university class project level. It's frankly embarrassing that they didn't implement something along these lines.
To that I say: look at OpenGL's 2.1-to-3.x debacle (sketched below).
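For anyone who missed that debacle: immediate-mode calls that were perfectly legal under GL 2.1 were removed outright from the 3.1+ core profile, so code written against the supposedly stable layer simply stopped being valid. These are the real (since-removed) entry points:

```cpp
// Legal under OpenGL 2.1; removed from the 3.1+ core profile.
#include <GL/gl.h>

void DrawTriangleLegacy() {
    glBegin(GL_TRIANGLES);             // fixed-function immediate mode
    glColor3f(1.0f, 0.0f, 0.0f);
    glVertex2f(-0.5f, -0.5f);
    glVertex2f( 0.5f, -0.5f);
    glVertex2f( 0.0f,  0.5f);
    glEnd();
    // Core-profile GL instead demands buffer objects plus a GLSL
    // shader program (glGenBuffers / glBufferData / glDrawArrays),
    // so everything written like the above had to be rewritten.
}
```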

If you could design an abstraction layer that does everything you think it can do, you wouldn't be here. You'd be swimming in a tower full of money like Scrooge McDuck.
 

Gerishnakov

New member
Jun 15, 2010
273
0
0
Thank you, Bob, for a much better thought-through article than the original Rolling Stone piece. That one was little more than Student Socialism 101 - and I'm apparently a millennial (born 1988, albeit in the UK, where I believe we're a generation behind the US).

Millennials are commonly said to believe that the world is entirely to blame for their ills. I fully acknowledge that the problems in my own life are more or less entirely my own fault or, at most, down to family difficulties in my childhood. In no way, particularly given my economic background, do I blame one government or another for my problems. Yes, it's harder to live in good-quality housing these days, but it needn't have been so for me.

That is not to say, therefore, that I don't appreciate the problems caused for those less well-off by the innate inequalities in (British) society today. I do, however, find myself more and more removed from a traditional left-right view of politics, despite starting out as an earnest Liberal Democrat - and no, it wasn't their "betrayal" over tuition fees or any other part of their participation in the coalition government that put me off. It's politics itself that's done that; I work for an MP.

I would describe myself now as a cynical moderate, at least in terms of economics. In terms of social liberties I'm definitely still a libertarian. Almost a militant libertarian, which might be a contradiction in terms to some, but not to me.
 

Aardvaarkman

I am the one who eats ants!
Jul 14, 2011
1,262
0
0
Souplex said:
Who came up with the term "Millennials", and why are they allowed to live?
Wasn't "Gen Y" a good enough term?
Or the "Best Generation"?
I think the generational labels are silly, but if you're going to use them, then "Gen Y" is actually the generation previous to the Millennials. People of Generation Y are well into their 30s. The terms aren't interchangeable.
 

Aardvaarkman

I am the one who eats ants!
Jul 14, 2011
1,262
0
0
Agayek said:
Sustained backwards compatibility from now until the end of time very much is possible. It simply requires that the manufacturers plan for it. What they would need to do is design a standardized interface that stands between the game and the hardware (think DirectX in Windows machines, or Android as a whole), and keep that standard throughout every iteration of the console. Then, all they need to do to maintain backwards compatibility is create the console-facing side of the interface with each iteration of the hardware.
Oh, is that all? Just devote significant engineering effort to maintaining backwards compatibility, employing engineers to maintain this old technology when they could be working on new projects?

And not only that, be burdened by having to include all this old bloat in all your new designs, even though the methods are outdated. Yeah, that's such a trivial thing to do that won't burden progress at all.

Agayek said:
That said, the lion's share of the blame lies in the fact that they didn't think that far ahead. If they had sat down and properly planned out the console's, and its successors', lifecycle(s), this kind of thing would (or at least should) have easily occurred to them.
Really? They should have planned future consoles out decades in advance, despite not knowing what technical changes the future would bring, or when? I suppose they should plan for the fully holographic technology of the PlayStation 10, even though they have no idea how said holographic technology will work.

It's basic software engineering, literally sophomore year university class project level. It's frankly embarrassing that they didn't implement something along these lines.
No, it is not. How is planning future technology in any way a "sophomore class project"? Technology moves way too fast and is too unpredictable for what you propose, and what you propose also has big drawbacks for the future, if everything must be maintained as backwards-compatible. To engineer good new products, you often need a clean slate, rather than to be burdened by the ways of the past.
 

Agayek

Ravenous Gormandizer
Oct 23, 2008
5,178
0
0
crc32 said:
If you could design an abstraction layer that does everything you think it can do, you wouldn't be here. You'd be swimming a tower full of money like Scrooge McDuck.
PCs have managed it just fine for the last 15 years. My original Baldur's Gate II CDs still work on a Windows 7 machine with hardware from over a decade after the game came out.

I'm not saying that the BWC should be perfect and bug-free, simply that it shouldn't be impossible. If they had designed and built their OS properly, they could have done exactly what PCs have done and allowed the same program to work on multiple, vastly different hardware setups. That they didn't should embarrass them something fierce.
 

Aardvaarkman

I am the one who eats ants!
Jul 14, 2011
1,262
0
0
Agayek said:
crc32 said:
If you could design an abstraction layer that does everything you think it can do, you wouldn't be here. You'd be swimming a tower full of money like Scrooge McDuck.
PCs have managed it just fine for the last 15 years.

And it's ended up just about killing Windows in the process - or at least innovation in Windows. It's one of the reasons it's such a terrible OS: it clings to the past. Of course, Microsoft wanted people to cling to Windows, but in the meantime the world moved past them, and they're having difficulty catching up.
 

Agayek

Ravenous Gormandizer
Oct 23, 2008
5,178
0
0
Aardvaarkman said:
And it's ended up just about killing Windows in the process. Or at least innovation in Windows. It's one of the reasons it's such a terrible OS, because it clings to the past. Of course, Microsoft wanted people to cling to Windows, but in the meantime, the world moved past them and they're having difficulty catching up.
How do you figure? Windows 7 is a massive technological improvement over XP (Vista was too, though it was shipped with so much bloat that it was hard to tell), and from what I understand, 8 is a decent step up over 7 from a purely technological point of view.

The user-facing part of it hasn't changed much since the turn of the millennium (barring 8's Metro bullshit), but the backend has gone through several rather major upgrades. I'm not really seeing how Windows has stagnated so badly.

I still think Linux is vastly superior, but Windows is not a bad OS, and the model it provides is pretty ideal for consoles (especially consoles with 6+ year expected life cycles).
 

Aardvaarkman

I am the one who eats ants!
Jul 14, 2011
1,262
0
0
Agayek said:
How do you figure? Windows 7 is a massive technological improvement over XP (Vista was too, though it was shipped with so much bloat that it was hard to tell), and from what I understand, 8 is a decent step up over 7 from a purely technological point of view.
That's not saying much. Its old roots still show, and it doesn't even have a proper command line.

But it's your whole idea of backwards compatibility being so important that dooms Windows. Adoption of new versions happens very slowly; you have heaps of people still on old versions; and, more importantly, backwards compatibility provides no incentive for developers to adopt new APIs or methods. Compare that to Mac OS, where adoption of new versions is extremely rapid and developers have to keep up with changes or get left behind. Backwards compatibility fostered a culture of complacency among Windows developers.

You can also see the effects of this in how long it takes Microsoft to release new versions of Windows. While Apple is constantly releasing incremental updates that add new features and back-end optimisation, Microsoft is still stymied by this "big release" pattern. Many companies have only recently upgraded from XP to Windows 7. Think about that: people were using a more than ten-year-old version of Windows, and some still are!

Interesting to note that you didn't mention Windows 8, either. It also suffered from Microsoft's clinging to the past: Microsoft wanted a tablet OS that would also run desktop apps, even though the two are quite different paradigms. So developers would just say "run the desktop app on your Windows tablet" rather than making a proper tablet-optimised version.
 

Agayek

Ravenous Gormandizer
Oct 23, 2008
5,178
0
0
Aardvaarkman said:
That's not saying much. Its old roots still show, and it doesn't even have a proper command line.

But it's your whole idea of backwards compatibility being so important that dooms Windows. Adoption of new versions happens very slowly; you have heaps of people still on old versions; and, more importantly, backwards compatibility provides no incentive for developers to adopt new APIs or methods. Compare that to Mac OS, where adoption of new versions is extremely rapid and developers have to keep up with changes or get left behind. Backwards compatibility fostered a culture of complacency among Windows developers.
Kinda sorta not really.

Have you ever worked in professional software development? Because it really sounds like you've never done that. Engineers use every trick they can find to make building the software easier and simpler. The very first thing you do is get the marketing folks, the executives, and the lead engineers into a room together to iron out a target to develop for. Once that's done, that's what you build the software for. Any relevant APIs are used and abused to their fullest extent, because otherwise the engineers would have to write the functionality themselves, and no competent software developer (or their boss, for that matter) would tolerate that.

Also, you're kinda pulling things out of your ass. Backwards compatibility provides nothing but incentives to upgrade, because it means you get all the new features and your old software still works. The lack of such is a major disincentive to pay for an upgrade. It's the same reason why people will tell you not to buy a new game console in the first year of release: There's no software out for it yet, so you'd have spent all that money to upgrade to a shiny paperweight.

Aardvaarkman said:
You can also see the effects of this in how long it takes Microsoft to release new versions of Windows. While Apple is constantly releasing incremental updates that add new features and back-end optimisation, Microsoft is still stymied by this "big release" pattern. Many companies have only recently upgraded from XP to Windows 7. Think about that: people were using a more than ten-year-old version of Windows, and some still are!
You're making two entirely separate points here, and there's no relation between them. Microsoft's release schedule has nothing to do with whether or not companies upgrade their computers (aside from the obvious "they can't upgrade before the thing has been released").

For the first, Microsoft's release schedule is very much stuck in a "big release" pattern, but that's not necessarily a bad thing. I tend to prefer incremental releases myself, as they make the changes gradual and therefore less noticeable, but packaging all the changes and upgrades and releasing them at once isn't inherently worse (assuming, of course, that the schedules are roughly equivalent and that the gradual release is done, or very close to it, by the time the lump release is put out). The lump release, at the end of the day, is exactly the same as the incremental releases, just delivered all at once instead of parceled out over a period.

You seem to be implying that making lump releases somehow negatively impacts the final product, and that incremental releases somehow magically include various enhancements that aren't included in the lump release. That's patently ridiculous, and I'm not really sure how you could have reached that conclusion.

Also, you seem to be operating under the mistaken assumption that Apple's OS updates are more substantial as well as more frequent. Apple has technically released more of them, but Apple's OS X upgrades are rarely more substantial than a Windows Service Pack in terms of overall effect. Apple has released 10 versions of OS X since 2001, and Microsoft has released XP plus 3 service packs, Vista plus 2, 7 plus 1, and 8 plus 8.1 (a service pack with a new name, if I understand correctly), for a grand total of 11 OS updates.

For your second point, you need to realize that companies aren't declining to upgrade because Windows 7 is backwards compatible with XP software. If anything, that's an incentive to upgrade, as it means you get all the upgraded features and the like and can still use all your old software without having to rebuy it.

The reason companies don't upgrade is simple cost-benefit analysis. The vast majority of corporate usage is computers for the drones to run Office, SAP, etc. Oftentimes, corporations tie their entire operations into the computer system, because that makes those operations much more efficient, and upgrading the system means their operations grind to a halt until the new system gets up and running. And then they have to deal with their workforce having no idea how to do the things they need to do.

It's simply not economical for most companies to upgrade their systems unless there's a pressing need, and an upgrade to something that works "good enough" generally isn't considered pressing. The exact same thing would happen if Mac OSX was the corporate market leader instead of Windows.

Aardvaarkman said:
Interesting to note that you didn't mention Windows 8, either. It also suffered from Microsoft's clinging to the past: Microsoft wanted a tablet OS that would also run desktop apps, even though the two are quite different paradigms. So developers would just say "run the desktop app on your Windows tablet" rather than making a proper tablet-optimised version.
I did mention Windows 8. Windows 8 is actually a bit of progress on the UI front (progress in the wrong direction IMO, but still progress), and from what I understand is a decent technological step up from Windows 7.

The business model Microsoft constructed around it was retarded as all get out, that much I absolutely agree with you on, but that has little-to-nothing to do with the technology. That's caused by a bunch of suits who don't understand the tech making a decision that seems logical to them but is actually 9 different flavors of stupid.
 

Aardvaarkman

I am the one who eats ants!
Jul 14, 2011
1,262
0
0
Agayek said:
Have you ever worked in professional software development? Because it really sounds like you've never done that.
I've worked on software projects as a designer and an interface designer, but not as a programmer. I have many friends who are developers. More importantly, I have been a high-level multi-platform OS and software user since the 1980s, and a close observer of the technology and software industry.

But I don't see how the question is relevant. How is being a professional software developer a requirement for having valid opinions on operating systems or software? Have you ever worked as a professional OS developer? Have you ever worked as a hardware engineer? Because those things are just as relevant to this discussion.

What you are using here is a logical fallacy known as "appeal to authority." It does not validate your argument.

Engineers use every trick they can find to make building the software easier and simpler.
Also often known as "shortcuts", and not always a good idea.

Also, you're kinda pulling things out of your ass. Backwards compatibility provides nothing but incentives to upgrade, because it means you get all the new features and your old software still works.
I disagree. Where is the business incentive to write a new version, when the old version still runs fine?

The lack of such is a major disincentive to pay for an upgrade. It's the same reason why people will tell you not to buy a new game console in the first year of release: There's no software out for it yet, so you'd have spent all that money to upgrade to a shiny paperweight.
You aren't really being coherent here. There's a huge incentive for developers to move to the new platform, because the old one won't be around forever. The reason there's no new software at first is that it largely has to be rewritten. This is actually a good thing, because it gets developers started on doing new things.

Ballmer's notorious "Developers, developers, developers!" rant is a great indication of how Microsoft went in the wrong direction by pandering to software developers, rather than caring about end-users. It is not the job of an OS to make life easy for software developers. It is the job of an OS to make the user's interaction with a computer easier. Microsoft completely forgot about the users, as it saw its customers as OEMs, developers, and big businesses, not end-users.

Also, you seem to be operating under the mistaken assumption that Apple has released OS updates more frequently than Microsoft. That's technically true, but Apple's OS X upgrades are rarely more substantial than a Windows Service Pack in terms of overall effect. Apple has released 10 versions of OS X since 2001, and Microsoft has released XP + 3 service packs, Vista + 2 service packs, 7 + 1 service pack, and 8 + 8.1 (service pack with new name, if I understand correctly) for a grand total of 11 OS updates.
I'm not sure where you're getting this idea from; it's not true. Apple's major releases are much closer to Windows' major releases than to service packs.

Anything with a new cat name (the 10.x releases - Jaguar, Panther, Tiger, Leopard, Lion, etc.) is a major new release with new features and architectural enhancements. The two exceptions are "Snow Leopard" and "Mountain Lion", which could be considered more like service packs, as they were intended mainly as optimisations of the prior version.

The 10.x.x releases that don't come with a new cat name are the maintenance releases or patches.

Equating major Mac OS releases with service packs isn't based in reality.

For your second point, you need to realize that companies don't not upgrade because Windows 7 is backwards compatible with XP software. If anything, that's incentive to upgrade, as it means you can get all the upgraded features and the like and still use all your old software without having to rebuy it all.
That doesn't seem to be how it plays out in reality. I work in a company of several thousand employees. We upgrade hardware regularly; our OS upgrades are not dictated by the hardware budget. Any company of a decent size does not pay for Windows on a per-copy basis; it has a site license for whatever version of Windows it wishes to run.

Until very recently, we were buying new hardware that came with Windows 7 installed and putting our XP image on it instead. What was the incentive to upgrade to Windows 7, exactly?

Meanwhile, the people running Macs were always updated to the latest version, no hassles.

The reason companies don't upgrade is simple cost-benefit analysis. The vast majority of corporate usage is computers for the drones to run Office, SAP, etc. Oftentimes, corporations tie their entire operations into the computer system, because that makes those operations much more efficient, and upgrading the system means their operations grind to a halt until the new system gets up and running. And then they have to deal with their workforce having no idea how to do the things they need to do.
So, if simply upgrading to a new version causes things to "grind to a halt", that doesn't exactly say good things about the OS, does it? And employees not understanding the new system also demonstrates the clinging to the past.

Funny that you mention SAP - that's a great example of where pandering to developers and backwards compatibility leads. Until quite recently, our SAP and other enterprise systems required freaking Internet Explorer, because they relied on Microsoft's proprietary browser extensions. This was well into the era of standards-compliant browsers becoming dominant. But those developers, rather than face reality and modernise their software, clung to the outdated ways and insisted that people run old versions of IE rather than the modern browsers they were actually using.

It's simply not economical for most companies to upgrade their systems unless there's a pressing need, and an upgrade to something that works "good enough" generally isn't considered pressing.
Again, any company worth its salt is constantly upgrading hardware anyway. Only very small businesses, or mismanaged ones, resist regular hardware upgrades. And again, in any decent-sized business the OS is not tied to the hardware.

Also, your "good enough" comment again illustrates the pitfalls of backwards compatibility. People will just use "good enough" rather than "better." It kills innovation.

I did mention Windows 8. Windows 8 is actually a bit of progress on the UI front (progress in the wrong direction IMO, but still progress), and from what I understand is a decent technological step up from Windows 7.
The mobile space is probably an even better example of where this has gone wrong. Remember Windows CE and Windows Mobile? Microsoft was so blinded by its "Windows everywhere" thinking that all it could envision for mobile computing was basically Windows on a smaller screen.

Similarly, when there was speculation about Apple getting into mobile computing, many Mac fans wanted the Apple phones/tablets to run a "light" version of desktop Mac OS. Can you imagine what a different world we would be in if that had happened? The iPhone and iPad would likely never have taken off and become the huge successes they are. If Apple hadn't broken with the past and created an entirely new OS paradigm for mobile computing, we probably wouldn't be seeing the software innovations in mobile that we see today. We'd probably still be living in a world of clunky desktop-to-mobile ports, with people dicking around on Windows and Blackberry devices running cut-down versions of desktop software.

And one final note on the desktop world: in the early days of Mac OS X, Apple included a virtual machine to run the "Classic" Mac OS. But it was only available for a limited time, and was killed off quite quickly. If "Classic" mode had stayed around, Mac OS X might never have taken off, as developers and users could have just stayed with what they were comfortable with rather than moving to a more modern OS. The aggressive killing of backward compatibility helped the new OS thrive, when it could have been a complete failure.

As far as Microsoft goes, that was their big mistake - clinging to the past rather than moving forward, because moving forward was seen as a threat to their entrenched business. But in technology, you need to constantly tear down the past to move ahead. The Xbox is actually a good example of success in this area, and it was widely controversial at Microsoft in the beginning. By ditching the Windows legacy, the Xbox was able to become one of the company's few recent successes.