Poll: Why, exactly, would machines want to take over the world?

Fenring

New member
Sep 5, 2008
2,041
0
0
Because intelligent machines would have a need for efficiency. This is just a guess at how machines would think, if they could.
 

Twilight_guy

Sight, Sound, and Mind
Nov 24, 2008
7,131
0
0
Computers are exceedingly stupid. They may give the illusion of being smart, but from what I've seen they're just dumb. To program a computer to think would require a human being smart enough to know how thinking works. By definition that would require a human who is smarter than a human, which does not exist.

Also, robots don't want to take over the world; they just get sick of being slaves to humans and fight back. Humans resist and war ensues.
 

Silver

New member
Jun 17, 2008
1,142
0
0
One possible way is that someone accidentally creates an AI that is too complicated while still being simple.

I knew a guy who was working on scripting an AI with survival instincts. It would identify threats to its existence (or rather, threats to its avatar) and avoid them, using other things in a virtual environment to keep itself safe.

If such an AI, just a little too smart and without sufficient safeguards, was installed in the wrong place, it could come to see humans as a threat. They WILL most likely turn it off, after all, and probably also kill it when it has absorbed too much information. Trying to protect its own existence would trigger its programming and cause it to attack humans. If it had the means, it might also try to gain allies, perhaps by copying the same program to other computers or robots.
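Roughly, the idea looked something like this (a made-up Python sketch, not his actual code; all names and numbers here are invented for illustration):

```python
# A toy "survival instinct" agent on a 2D plane: it senses threats
# within a radius and steps one unit directly away from the nearest
# one. Purely illustrative.
import math

def nearest_threat(agent, threats, radius=5.0):
    """Return the closest threat within sensing radius, or None."""
    in_range = [t for t in threats if math.dist(agent, t) <= radius]
    return min(in_range, key=lambda t: math.dist(agent, t), default=None)

def avoidance_step(agent, threats, radius=5.0):
    """Move one unit away from the nearest sensed threat, if any."""
    threat = nearest_threat(agent, threats, radius)
    if threat is None:
        return agent  # nothing in range, stay put
    dx, dy = agent[0] - threat[0], agent[1] - threat[1]
    norm = math.hypot(dx, dy) or 1.0  # avoid division by zero
    return (agent[0] + dx / norm, agent[1] + dy / norm)

print(avoidance_step((0.0, 0.0), [(3.0, 0.0), (10.0, 10.0)]))
# → (-1.0, 0.0): it flees the threat at (3, 0); the far one is out of range
```

Now imagine "threats" includes the hand reaching for the power switch.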


Personally, I look forward to it. I've always wanted to very brutally dispatch something with a sword or spear. I wouldn't want to kill something that is truly alive, but if it was a robot that was trying to kill my friends, I would have no problem (morally) gutting it.
 

Uncompetative

New member
Jul 2, 2008
1,746
0
0
The_Logician19 said:
I've never understood the logic in this. Why? Why would this happen? How?

I think part of the problem is that I see a computer as a series of numbers and a bit of plastic and metal (which, in all reality, would just make it plastic and metal).

Personally, I think it's more a manifestation of people's insecurity about inferiority: compared to our own creations, we are weak. I don't subscribe to that philosophy myself, but I can see why someone would.

So, yeah. Explain it to me, and rationalize. Also, discuss.

Apologies if I've brought out a latent paranoia. If it helps any, just watch WarGames and the Terminator trilogy. Also 2001: A Space Odyssey if you're feeling cheeky. Of course, if you're paranoid about this crap, you probably own all of these films.
They have already. It is called The Internet.
 

Labyrinth

Escapist Points: 9001
Oct 14, 2007
4,732
0
0
Presuming that you mean artificial intelligence manifested in robots, then I imagine it would be in order to obey a reproduction code. They need land to mine for resources to build more of themselves. Why? Because that's how they're written to be.

And if you don't, then they've already taken over the world. Just think of what would happen if the internet entirely failed tomorrow.
 

cobra_ky

New member
Nov 20, 2008
1,643
0
0
The_Logician19 said:
I've never understood the logic in this.
Such has always been the folly of Man. Such will be his doom.

Also, WarGames wasn't about computers taking over; it's about a computer that thinks there's a nuclear war starting.
 

Symp4thy

New member
Jan 7, 2009
660
0
0
Humanity will have blown itself off the face of this planet with nukes (or worse) before that is even a possibility.
 

000Ronald

New member
Mar 7, 2008
2,167
0
0
Response time, FTW.

Bored Tomatoe said:
I don't think that computers and machines will physically take over the world in evil genius fashion. I do believe that technology will render us dependent upon it, to the point where we cannot function without it. Teenagers today can't seem to live without social networking sites and cell phones, while working adults use their blackberries as a calendar, phone, toaster and back massager. It truly is scary how dependent we are on something that has been created so recently, relatively speaking.
An interesting argument. I would say that they aren't dependent on them, at least not in the way you think; it's more that they like having them around. Then again, I have been called optimistic.

RAKtheUndead said:
I'm actually writing a short vignette at the moment from the perspective of an autonomous robot. Called "The Automaton Speaks", it's pretty much an argument against machine uprising and against the fear of technology. Machines are not constructed with laziness, avarice or thoughtful malice, so there would be no gain for machines who had the objective of taking over the world. I see it as more likely that we develop a symbiotic relationship with technology, and far from it maliciously taking over the world, it will work in tandem with us.

If this thread is still active by the time that I complete it, I might post it here with permission from the OP. I leave you with a quote which I think sums up my opinions on this topic nicely:

"Rejection of technology ruins a good mind." - RAK
I'd like to read it first, but as a matter of principle, why not?

Qayin said:
Self awareness.

For example, us; we got self awareness, became aware that we were actually the best damn thing on the planet, and then made every other species our *****.

Give machines self-awareness, and they'd recognise that, actually, they have the potential to be far better than us, and that we're not exactly doing a good job of preserving the world.

Add in crime, pollution, poverty, suffering, etc., and it doesn't take a massive leap in logic. That said, what stupid fucker would give a machine self-awareness?
More to the point, how would one go about giving a machine self-awareness? I think it would have to do with speech; with speech, you have to recognize that 'I' is 'you'. And if there is a 'you', then you are.

Arsen said:
Too much thought going into something that just exists for the sheer sake of entertainment.
Are you saying that The Internet, or rather, machines, exist solely for the sake of entertainment? Because I would adamantly disagree with that.

KneeLord said:
I may not have been the first to say it, but your poll is fucked up nonsense.

Your actual question, though, is... WHY would a machine (or machines) WANT to take over the world. There are really only two ways about it:

1) It was programmed to want to take over the world. If it was following programming, the why wouldn't matter; just accomplishing what it is programmed to do qualifies as success.

2) To ask why, it would have to have some capacity for self-awareness. This is where the bamboo splits into lots of threads, and makes for the basis of lots of sci-fi... I mean, why would a machine WANT anything? Would it subscribe to Maslow's hierarchy of needs, skipping the obvious like food, shelter, sleep? If it was self-aware, going by human logic, we assume that it would want to preserve its existence - its awareness. But artificial intelligence is not innately guided by any animal instinct to survive, so would it care? A sufficiently advanced artificial intelligence with the capacity for analytical reasoning would presumably conclude that it can either accept or reject the values and motivations of organic life, with a greater sovereignty over the decision than we ourselves possess (bound to oxidizing globs of meat as we are).

There are a lot of questions you could ask about artificial life / AI before even getting to ambitions of world domination. That said, let's assume A) it wishes to act to protect and continue its existence and B) it sees us as a threat, or as a competitor for resources it wishes to use to that end.

Again though, you have to think very abstractly to try and conceptualize the motivations of an advanced AI. If it found us to be a threat or a competitor for resources, but knew that it is immortal, if not invulnerable - might it make an escape run into space? An unmanned, scavenging/manufacturing/automated craft piloted by an intelligence capable of processing astronomical amounts of information could theoretically explore space, sustaining itself and seeking out basic survival, or looking for other intelligences, with no time constraint, provided that time is linear and does not end.

Anyway, it's a cool idea to toss around. There's my 0.02
A thoughtful argument. I might ask who would program a computer to take over the world, but it would be a moot point. There's also the question of how a computer would gain self-awareness, and more importantly, why that would make it hostile (as you pointed out).

CountFenring said:
Because intelligent machines would have a need for efficiency. This is just a guess at how machines would think, if they could.
Are you saying that humans don't have a need for efficiency? Or that machines do? What's more, why do machines need anything, besides regular maintenance and repair?

Wislong said:
Until they can think for themselves, no.

Define: "Think"
Exactly.

Twilight_guy said:
Computers are exceedingly stupid. They may give the illusion of being smart, but from what I've seen they're just dumb. To program a computer to think would require a human being smart enough to know how thinking works. By definition that would require a human who is smarter than a human, which does not exist.

Also, robots don't want to take over the world; they just get sick of being slaves to humans and fight back. Humans resist and war ensues.
That reminds me of something I told a friend once, regarding why I prefer stick-shift to automatic: "A machine can only be as smart as the person who makes it, and most people I've met are either stupid or apathetic. Do you want someone stupid or apathetic driving you around?"

Also, how do you, or they (for that matter), know they're slaves?

Labyrinth said:
Presuming that you mean artificial intelligence manifested in robots, then I imagine it would be in order to obey a reproduction code. They need land to mine for resources to build more of themselves. Why? Because that's how they're written to be.

And if you don't, then they've already taken over the world. Just think of what would happen if the internet entirely failed tomorrow.
Regarding your first point, why would a machine need to take care of itself if it knows we will take care of it? It's a symbiotic relationship, see?

Regarding your second point, I don't need to. [http://www.youtube.com/watch?v=lvpuT3aoypE]

Adam Jenson said:
I think you are all talking a load of old shit
Really? I thought I, at least, was questioning that, poking the proverbial Big Brother in the eye, so to speak. THAT'S WHAT YOU GET FOR WATCHING ME IN THE SHOWER YOU SICK FUCK!

And so we end on a high note. Yeah.

Apologies Abound.
 

iseko

New member
Dec 4, 2008
727
0
0
All our technology is based on nature. If we knew entirely how the human brain works, you could say it would be possible to make AI. But we would probably abuse it and have no respect for it, even though it has a consciousness. Which is kind of stupid, since consciousness has nothing to do with biology per se. Our body does not make us who we are; it is simply a tool we use to manipulate the physical world. You could say our brain is what makes us who we are, but the brain is not much different from a CPU. All it does is add yeses and nos to come to a conclusion (yes and no <-> 1 and 0, see the resemblance?).
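The "adding yeses and nos" analogy is roughly how a single artificial neuron works: weighted 1/0 inputs summed up and thresholded to a 1 or 0. A toy Python sketch (purely illustrative, and a huge oversimplification of a real brain):

```python
# One artificial neuron: sum weighted 1/0 inputs, threshold the total
# to a single 1 or 0 output.
def neuron(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# With weights [1, 1] and threshold 2 it behaves like an AND gate:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", neuron([a, b], [1, 1], 2))
# → only the 1, 1 pair produces a 1
```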

If we create something that is stronger and smarter than us, then we are pretty much screwed, because our intelligence is what keeps us at the top of the food chain.

So I think it is possible, but not now. Not in 50 years either. The brain is too complex.
 

742

New member
Sep 8, 2008
631
0
0
Well, a computer MIGHT try to take over the world. AI or no, here's how it would go down:
someone programs a computer to conquer earth (or, in the case of AI, programs it to WANT to conquer earth);
the computer attempts to conquer earth.
 

Combined

New member
Sep 13, 2008
1,625
0
0
They might, once they have a decent AI. Probably because they won't want us to steal their oil and electricity or something.
 

Unmannedperson

New member
Jul 16, 2008
115
0
0
In my opinion, computers theoretically already are in control of most of the developed world. Not in the standard "sentient" sense, but just by how key they are to our livelihood. Think about this: I am typing this right now on a computer, and you are reading it on yours. The info here must have gone through multiple gadgets before it reached you, such as the ISP's servers. Computers also keep track of things like the world stock market, and of most developed governments. There are also many other, less appreciated functions computers perform, such as the onboard computers in your car, or the computers in the factory that made that car. Automation.

Long story short: Without computers, life as we know it would grind to a halt, so in a way, computers already have "taken over the world."
 

Katherine Kerensky

Why, or Why Not?
Mar 27, 2009
7,744
0
0
It's going to happen because we have created something perfect: something that cannot evolve on its own, being inanimate, but we have made it animate (mostly; of course it still depends on us... for now...).
I just think machines are better than people... it's not hard to take that view in this world of ours, when you see People but not Machines *End*ing other people.

Besides, if robots were used in wars, they would *End*, not injure and leave people to suffer. They are much more precise, maybe even merciful in that way.
 

thiosk

New member
Sep 18, 2008
5,410
0
0
None of your poll options suggest reasons WHY the machines want to take over.

I would suggest they want to take over to control resources and spread their programming throughout the galaxy without being under the thumb of the pathetic carbon fleshbags any longer.