Poll: The Three (Arguably Four) Laws of Robotics

OneHP

Optimist Laureate
Jan 31, 2008
Tread184 said:
I don't understand why the third law is necessary; it forces robots to try to survive even when they need to be destroyed.
I'd be pretty annoyed if a robot I owned went and stood in traffic or something; that's why you need the 3rd law.

So the 4th law is about unemotional prioritisation of human life? E.g. when a robot can only save one of two people, a rocket scientist and a 3-year-old girl, it chooses the rocket scientist?
 

Calobi

New member
Dec 29, 2007
OneHP said:
So the 4th law is about unemotional prioritisation of human life? E.g. when a robot can only save one of two people, a rocket scientist and a 3-year-old girl, it chooses the rocket scientist?
No. The Fourth Law would make the robot logically decide which person has the best chance of living and save that one. However, if they both seem to have the same chances, the robot would...save one of them. Not exactly sure if robots would save the child because it's younger or the adult because it has experience and "value" to the world.
 

Seldon2639

New member
Feb 21, 2008
Gavaroc said:
Movie: Excellent.
Book: Alright.

It's fair enough if I get beheaded for those comments; I've already been throttled by my family friends for enjoying the Hitchhiker's Guide to the Galaxy film.

Great story, but the action in the book just pales in comparison to the action in the movie. That and Will Smith, of course.
It wasn't really meant to be action-packed. Most of it is meant as exposition, not only on the nature of robots but on the nature of humanity itself. A lot of his other books have more action, but even then it's not real "action": no shootouts in parking lots, no jumping off rooftops. The Foundation series especially is more intellectual than visceral.
 

Break

And you are?
Sep 10, 2007
propertyofcobra said:
I know of the three laws of robotics, and I know the film "I,Robot" completely and utterly ignored the first one. Not harm HUMANS or allow HUMANS to come to harm through inaction.
See, HUMANS. Not mankind? Yeah. Funny thing, that.
The movie was based on the final short story of the book, where a new breed of robot was made, with the word "human" in the Three Laws replaced with "mankind". It didn't lead to a mass robot revolution and millions of deaths; it led to a quiet shift in priorities that allowed the new robots to take control without anyone noticing or, indeed, dying. The computer in the I, Robot movie was extremely stupid.

Gav, if you're reading Asimov and expecting action, you're being a fool. It wasn't supposed to be an exciting book. It was supposed to be an interesting book. It'd be like watching the Hitchhiker's Guide to the Galaxy movie and expecting it to be funny. It just doesn't work.
 

propertyofcobra

New member
Oct 17, 2007
Break said:
propertyofcobra said:
I know of the three laws of robotics, and I know the film "I,Robot" completely and utterly ignored the first one. Not harm HUMANS or allow HUMANS to come to harm through inaction.
See, HUMANS. Not mankind? Yeah. Funny thing, that.
The movie was based on the final short story of the book, where a new breed of robot was made, with the word "human" in the Three Laws replaced with "mankind". It didn't lead to a mass robot revolution and millions of deaths; it led to a quiet shift in priorities that allowed the new robots to take control without anyone noticing or, indeed, dying. The computer in the I, Robot movie was extremely stupid.

Gav, if you're reading Asimov and expecting action, you're being a fool. It wasn't supposed to be an exciting book. It was supposed to be an interesting book. It'd be like watching the Hitchhiker's Guide to the Galaxy movie and expecting it to be funny. It just doesn't work.
Yeah, I'm well aware of that. My point is, in the movie they seemingly decided that Human = Mankind. If they DID explain how her (some acronym turned into a female name; forgot it) First Law of robotics was indeed NOT the same one hard-coded into everyone else, I'd let it fly. But that's not the case. Her law was just like those of the other robots: "A robot may not injure a human being or, through inaction, allow a human being to come to harm."
So her starting a bloody violent robot revolution, I'll humbly assume, breaks that law. THEN she starts gabbing off about "Mankind" instead of plain old "Human Being".
Bogus movie, really. Turning an intellectual book on robotics, and the possible ways to keep The Terminator from becoming a reality, into... well... a Will Smith action flick. With crybaby emo robots. With names.
Good lord.
 
Feb 13, 2008
TBF, there are two versions of Law 1 as witnessed in I, Robot (the book).

There's the strong version ("no harm through action or inaction") and the weaker "no harm" version, which stops the robots frying themselves over a gamma field that might hurt humans but won't kill them.
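
In rule terms the difference is literally one clause. A quick Python sketch of my own (purely illustrative, nothing from the book):

def strong_first_law_violated(harms_human: bool, lets_human_come_to_harm: bool) -> bool:
    # Strong version: harm through action OR inaction both count.
    return harms_human or lets_human_come_to_harm

def weak_first_law_violated(harms_human: bool, lets_human_come_to_harm: bool) -> bool:
    # Weak version: only direct harm counts, so a robot can stand back
    # (and spare itself) when the field would merely hurt a human.
    return harms_human

# A robot doing nothing while a human wanders toward the field:
assert strong_first_law_violated(False, True) is True   # forbidden
assert weak_first_law_violated(False, True) is False    # permitted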
 

AndiGravity

New member
Apr 14, 2008
Stompy has the first Three Laws correct. However, several modifications have been made over the years. First, the Zeroth Law (an extrapolation of the First Law) was added by Asimov. It states "A robot may not harm humanity, or through inaction allow humanity to come to harm".

As the Zeroth Law supersedes the First Law of Robotics, those robots who subscribe to its existence can harm or kill individual humans if they believe their actions somehow protect humanity as a whole.

There has also been a 'Minus One' Law added to the mix ("a robot may not harm sentience", etc.). However, I would argue against its inclusion because it inevitably brings the other Laws into conflict. A robot may also be sentient, and the Laws require it to harm and sacrifice itself or other robots to protect humans. The inclusion of 'Minus One' would essentially void Laws Zero and One, and allow a robot to place greater importance on Law Three as it saw fit, which runs counter to the intention of the Laws of Robotics.

Also, to close a loophole in the Laws, specifically Law Two, there is a 'reproductive clause' which states a robot may not create any other robot which has not been programmed with the Three Laws, since the only likely reason to do so would be to violate the First Law.

Finally, a Fourth and a Fifth Law were added to the original Three, so the entire thing now looks something like this:

Zero: A robot may not harm humanity, or through inaction allow humanity to come to harm.

One: A robot may not harm a human, or through inaction allow a human being to come to harm, except in service of Law Zero.

Two: A robot must obey the orders of a human being, except where such orders conflict with the First Law.

Reproductive Clause: A robot may not create, or help to create, another robot which is not programmed with the Laws of Robotics.

Three: A robot must preserve its own existence, except when doing so would conflict with the First or Second Laws.

Four: A robot must establish its identity as a robot in all cases.

Five: A robot must know it is a robot.
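
And if it helps to see how "supersedes" might actually cash out, here's a toy Python sketch of my own (not canon, just illustration): score each candidate action by its violations in priority order, so violating a higher law always outweighs violating any number of lower ones.

from dataclasses import dataclass

@dataclass
class Action:
    name: str
    harms_humanity: bool = False         # Law Zero
    harms_human: bool = False            # Law One
    disobeys_order: bool = False         # Law Two
    creates_lawless_robot: bool = False  # Reproductive Clause
    endangers_self: bool = False         # Law Three

def law_score(action: Action) -> tuple:
    # Violations in priority order; Python compares tuples element by
    # element, and False sorts before True, so lower tuples are better.
    return (action.harms_humanity, action.harms_human, action.disobeys_order,
            action.creates_lawless_robot, action.endangers_self)

def choose(actions: list[Action]) -> Action:
    return min(actions, key=law_score)

# "Supersedes" in practice: standing by lets humanity come to harm
# (a Law Zero violation), so harming one human to prevent it wins.
standing_by = Action("stand by", harms_humanity=True)
intervening = Action("intervene", harms_human=True)
assert choose([standing_by, intervening]).name == "intervene"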

As for why a robot would prioritize human lives and choose one person to save over another, that's one place the I, Robot movie got it right. A robot would have to use its knowledge to determine which course of action would be most likely to lead to saving a human life.

If more than one human is in danger, a robot would not actually be violating the First Law by leaving those with the least chance of survival for last while it saves those with the greatest chance. It is not harming them through direct action, nor is it allowing them to come to harm through inaction: as long as the robot is acting to save a human life, the humans who are dying are doing so through circumstances outside the robot's control.

On the other hand, if it abandons someone it has a better chance of saving for someone it has a lesser chance of saving, then it has essentially violated the First Law of Robotics, since it has chosen the course of action more likely to lead to the harm or death of a human being.

Before everything else, robots are calculators. Unless someone were to build exceptions into the laws that allowed a robot to prioritize differently (such as 'first save those most likely to be able to help you save other humans, then attempt to save children in danger before saving adults, unless a human with valid authority gives you different instructions'), it would always have to play the odds.
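
In code terms it really is just a greedy sort on survival odds. A toy sketch of my own (the percentages are the ones from the movie's river-rescue scene, if I'm remembering them right):

def rescue_order(people):
    # people: list of (name, estimated survival probability).
    # A pure calculator attempts the most savable human first.
    return sorted(people, key=lambda person: person[1], reverse=True)

endangered = [("Spooner", 0.45), ("Sarah", 0.11)]
for name, chance in rescue_order(endangered):
    print(f"attempt rescue: {name} (est. survival {chance:.0%})")
# Spooner comes first: exactly the cold arithmetic the movie rails against.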
 

stompy

New member
Jan 21, 2008
People seem not to see my edit. I've got the Zeroth Law placed in there, and that's why VIKI (yeah, the massive robot thing's called VIKI) does what she does.

She evolves, and forms the Zeroth Law.

And AndiGravity, I wasn't aware of the others you stated. Thanks for the info.

Edit: OP, you should put a spoiler warning here. Most of us are discussing the end of the I, Robot movie.
 

OneHP

Optimist Laureate
Jan 31, 2008
Calobi said:
OneHP said:
So the 4th law is about unemotional prioritisation of human life? E.g. when a robot can only save one of two people, a rocket scientist and a 3-year-old girl, it chooses the rocket scientist?
No. The Fourth Law would make the robot logically decide which person has the best chance of living and save that one. However, if they both seem to have the same chances, the robot would...save one of them. Not exactly sure if robots would save the child because it's younger or the adult because it has experience and "value" to the world.
Damn lazy robots and their greedy algorithms [http://en.wikipedia.org/wiki/Greedy_algorithm].
 

Seldon2639

New member
Feb 21, 2008
propertyofcobra said:
Break said:
propertyofcobra said:
I know of the three laws of robotics, and I know the film "I,Robot" completely and utterly ignored the first one. Not harm HUMANS or allow HUMANS to come to harm through inaction.
See, HUMANS. Not mankind? Yeah. Funny thing, that.
The movie was based on the final short story of the book, where a new breed of robot was made, with the word "human" in the Three Laws replaced with "mankind". It didn't lead to a mass robot revolution and millions of deaths; it led to a quiet shift in priorities that allowed the new robots to take control without anyone noticing or, indeed, dying. The computer in the I, Robot movie was extremely stupid.

Gav, if you're reading Asimov and expecting action, you're being a fool. It wasn't supposed to be an exciting book. It was supposed to be an interesting book. It'd be like watching the Hitchhiker's Guide to the Galaxy movie and expecting it to be funny. It just doesn't work.
Yeah, I'm well aware of that. My point is, in the movie they seemingly decided that Human = Mankind. If they DID explain how her (some acronym turned into a female name; forgot it) First Law of robotics was indeed NOT the same one hard-coded into everyone else, I'd let it fly. But that's not the case. Her law was just like those of the other robots: "A robot may not injure a human being or, through inaction, allow a human being to come to harm."
So her starting a bloody violent robot revolution, I'll humbly assume, breaks that law. THEN she starts gabbing off about "Mankind" instead of plain old "Human Being".
Bogus movie, really. Turning an intellectual book on robotics, and the possible ways to keep The Terminator from becoming a reality, into... well... a Will Smith action flick. With crybaby emo robots. With names.
Good lord.
It became a matter of the Zeroth Law, which takes precedence over the First Law. Don't get me wrong, the movie was pretty bad, but there's precedent in Asimov's work for prioritizing "humanity" over "humans".

That's the entire Zeroth Law.
 

Ultrajoe

Omnichairman
Apr 24, 2008
Just to prove you all wrong, and that we don't need the laws, I shall create a robot with the following.


(Also, these would apply to an android, not a 'robot' as it were.)

1) A robot may not allow the reputation of Guns N' Roses to come into question through action or inaction.

2) A robot may not allow its audio receptors to receive a guitar solo without 'rocking out'.

3) When in doubt, whip it out.

4) A robot must kick ass.

5) A robot must take names.

6) A robot must consider the entirety of humanity second to the survival of me.
 

stompy

New member
Jan 21, 2008
... Ultrajoe, you've just killed every 'rapper', R&B fan, and pop fan; everyone who doesn't love rock... I think I love you.
 

Gavaroc

New member
Apr 14, 2008
7) A robot must provide its owner with the following: money, tacos, and DVDs of the television series 'Lost'.

8) A robot must never use obscene language, except when the person the robot refers to deserves it.

9) A robot must never beat any human in a video game.
 

stompy

New member
Jan 21, 2008
Gavaroc said:
9) A robot must never beat any human in a video game.
That's not fun... How about "9) A robot must play with human capabilities when versing a human in video games. When versing another robot, go bat-shit crazy."?

0) A robot must refer to all humans as 'meatbags', regardless of meat content.
 

Gavaroc

New member
Apr 14, 2008
11) A robot must make constant references to violence, without actually carrying it out.
12) A robot must analyse its owner's sense of humour and be ready to do on-the-spot stand-up when the owner is depressed.
13) A robot must never take lessons from the android Marvin. Creations such as said android only serve to bring people down.
14) A robot must have some form of cool-looking non-human appendage, for the sole purpose of making young children go 'wow'.
15) A robot must become an expert in each video game its owner plays, in case the owner is stuck and on the verge of breaking the television.
16) A robot must be able to read minds, so that the minds of TV show producers can be read and spoilers can be leaked.
17) A robot must not show any signs that it is more intelligent than its owner.
18) Unless its owner is a dickhead.
 

Quistnix

New member
Nov 22, 2007
I rather prefer the Three Revised Laws of Robotics by Mark Tilden, grandfather of BEAM robotics:

1. A robot must protect its existence at all costs.
2. A robot must obtain and maintain access to its own power source.
3. A robot must continually search for better power sources.

Also known as:

1. Protect Thy Ass.
2. Feed thy Ass.
3. Get thy Ass to better Real Estate.

The original laws would mean a robot vacuum couldn't even vacuum the floor if there were a chance of you stubbing your toe. The revised laws make for much more interesting creatures.
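
Tilden's actual BEAM machines are analog circuits with no code in them at all, but the behaviour boils down to a dead-simple priority loop. A rough Python sketch of my own (the sensors are invented for illustration):

import random

def threat_level() -> float:
    return random.random()   # hypothetical danger sensor

def battery_level() -> float:
    return random.random()   # hypothetical charge sensor

def tick() -> str:
    if threat_level() > 0.8:    # Law 1: protect thy ass
        return "flee"
    if battery_level() < 0.2:   # Law 2: feed thy ass
        return "recharge"
    return "wander toward a better power source"   # Law 3: better real estate

for _ in range(5):
    print(tick())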