MIT Researcher Proposes Rights for Robots

Moonlight Butterfly

Be the Leaf
Mar 16, 2011
6,157
0
0
I think it's a good idea to get something in place before we create any sort of robotic AI, but that is a long, long time coming :|
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
I have a crazy idea: don't make a true AI, and you won't have to worry about the ethical and legal ramifications of robot rights.
 

Paradoxrifts

New member
Jan 17, 2010
917
0
0
I got this brilliant mental image of the battle-hardened colonel with manly tears streaming down his face as the bomb-detecting praying mantis robot takes one for the team. :p
 

jbm1986

New member
May 18, 2012
199
0
0
Quaxar said:
FREE THE ROOMBA!

On an unrelated note... I wonder if she's related to former head of MI6 Darling. Probably not. I don't know why I even brought that up...
[HEADING=1]WHY DID WE GIVE THEM GUNS!?[/HEADING]
 

Woodsey

New member
Aug 9, 2009
14,553
0
0
It's ever-so-slightly premature, although I would have thought the eventual conclusion is that if robots get to the point of having genuine AI, then we'd have to give them rights closer to those of humans than of animals.
 

Scrustle

New member
Apr 30, 2011
2,031
0
0
Sorry, but that's bullshit. Just because we project emotions onto things doesn't mean they have them or deserve to be treated as if they do.

We may project onto some animals, but we know that other animals do in fact have feelings, and the law reflects this. No one is going to prosecute you for torturing an insect or sticking a sharp hook through the face of a fish, but you will be punished for abusing something like a dog or an ape. These animals do have feelings and thoughts; we don't just project them. When they are abused they show obvious signs of emotional response and even mental damage. A fish, however, may feel fear but little beyond that. We don't suddenly say that fish need rights because we feel sympathy for Nemo's father in Finding Nemo.

When robots get more advanced, perhaps they will deserve rights, but we are a very long way off from that, and our own projection alone is not a reason to give them rights.
 

Diddy_Mao

New member
Jan 14, 2009
1,189
0
0
Wondering how many people actually read the paper, read the article about the paper...or just saw the title and decided to post.

"My toaster doesn't need to vote...Hurrrr!"

The sheer amount of knee-jerk technofear shown here is, quite frankly, one of the better cases for promoting this type of incremental advancement. As a species/society we simply aren't going to be willing to share our "god-given rights" (their words, not mine), and it's going to be easier to make small changes as necessity demands than to ignore the issue outright until it's too late.
 

samahain

New member
Sep 23, 2010
78
0
0
"Animal Rights" and "Children's Rights" are forged by, well... Human grownups (!) but A.I. is by definition able to communicate.

It's gonna get real when the robots speak their piece.
 

Zipa

batlh bIHeghjaj.
Dec 19, 2010
1,489
0
0
It seems a bit premature; robots can't think (yet), but when there are sentient, self-aware robots (like Data from Star Trek), then yeah, they deserve all the rights of a self-aware being.

We should focus on equal rights for humans first, though, I think; maybe when we get that right we can move on to robots.
 

Coffinshaker

New member
Feb 16, 2011
208
0
0
Not sure how this is really news... many people, including myself, have published articles and had discussions about this very subject in great detail. Nothing really new, but hopefully it's something we will continue to look into, especially as computers and robotics get closer to emulating human/living characteristics.
 

Aeshi

New member
Dec 22, 2009
2,640
0
0
So where are our rights for viruses[footnote]Biological ones, though I suppose this could apply to digital ones as well.[/footnote] then? Since we're apparently giving rights to things that are barely self-aware (maybe not even that) and that only react to stimuli rather than out of any actual intellect. And I'd say viruses need rights far more, since unlike robots they're the victims of a near-constant genocide.

She is right that people tend to get emotionally attached to robots though. Anyone heard of Sergeant Talon?
 

PinkiePyro

New member
Sep 26, 2010
1,121
0
0
Personally, I say it couldn't hurt to have a few basic rights for robots. They may not be advanced enough to care yet, but it could help smooth over any issues later on if they do indeed gain human-level intelligence (or if alien robots that may or may not look like Earth vehicles show up).
 

Winthrop

New member
Apr 7, 2010
325
0
0
gritch said:
sethisjimmy said:
Pets are actually alive; robots just mimic life. Just because we can get emotionally attached to something doesn't mean it's now an entity on the level of animal life.
Actually, I would argue it's our ability to project our own feelings and emotions onto another object/animal/being that underlies our entire definition of ethics and rights. Ethics, for me, have always been derived from our own personal desires. I would not like to die, therefore I consider you killing me to be bad. I am also able to project my own thoughts onto other objects, animals, or people. The logic would go: I don't like to die, so I bet that guy over there wouldn't like to die either. How easily I can empathize with the object determines the degree of "badness".

As a human I can easily empathize with another human, therefore I consider killing another human very bad. Most humans are also able to empathize with animals of higher intelligence (dogs, cats, etc.) and those that share many common features with humans (chimps, apes). Most people would consider killing these animals to be bad, at least far worse than killing animals of lesser intelligence and more alien features (such as insects). Often people can form strong familial bonds with such animals (pets, as they're called).
Anything that people can empathize with strongly enough will incite an ethical response and thus deserves rights. If people can empathize with non-living robots as well as with animals, those robots deserve rights on par with animals. Claiming something deserves rights only because it's alive is nothing more than a rationalization.

Geez sorry about the long post. I guess I just find this topic very interesting.
You actually really made me think with this. I came in here wanting to say it was stupid, but now I'm not sure I believe that. Somebody actually convinced somebody of something online. That's crazy. Anyway, I'm going to ramble for a while. What defines morality and ethics, and what should confer rights? Is it intelligence? If so, shouldn't hunting and the entire meat industry be shut down? But I don't feel that way personally, yet I would be appalled at someone killing a dog for no reason. Perhaps then it is the projection of myself onto a relatable object or organism, as the person I quoted has said. If so, then you are correct that ethics should be considered for nonliving things. But what would the repercussions be? Videogaming might become inhumane in the eyes of the law, machinery couldn't be used in factories, and hitting your computer when it isn't working would be considered illegal. Obviously these ideas are ridiculous. Perhaps then a set of rights should be created that isn't that of a human or of a pet, but of something completely different.
 

gritch

Tastes like Science!
Feb 21, 2011
567
0
0
Winthrop said:
You actually really made me think with this. I came in here wanting to say it was stupid, but now I'm not sure I believe that. Somebody actually convinced somebody of something online. That's crazy. Anyway, I'm going to ramble for a while. What defines morality and ethics, and what should confer rights? Is it intelligence? If so, shouldn't hunting and the entire meat industry be shut down? But I don't feel that way personally, yet I would be appalled at someone killing a dog for no reason. Perhaps then it is the projection of myself onto a relatable object or organism, as the person I quoted has said. If so, then you are correct that ethics should be considered for nonliving things. But what would the repercussions be? Videogaming might become inhumane in the eyes of the law, machinery couldn't be used in factories, and hitting your computer when it isn't working would be considered illegal. Obviously these ideas are ridiculous. Perhaps then a set of rights should be created that isn't that of a human or of a pet, but of something completely different.
Glad I got you thinking about it. I guess some people actually DO read my posts!
You bring up some interesting scenarios, but I believe with some careful consideration we can eliminate some of the confusion. While it's true an individual might feel strong affection toward a completely inanimate object (such as a car they've given a name), they don't actually seem to be projecting their emotions onto that object; rather, they're considering that object a possession of theirs. Harming or stealing another's possession indirectly harms the person in possession of said object. These objects do not deserve rights themselves, but harming them indirectly harms a person instead. Harming the object itself isn't bad, but harming that person's object is bad.

To address your issue with video games, we have to make a distinction between imaginary and real constructs. When someone reads a book or plays a videogame, they are conscious of the fact that the characters and plots within are not real, no matter the degree of emotional attachment they incite. We can always flip back a few pages or restart the chapter and these characters are back to where they were. For imaginary characters recorded in media there is a finite set of actions they can/will perform. Most people are able to realize this and, to a degree, dissociate their own emotions. Killing off your most beloved character might cause anger or frustration towards the writer, but most people would never think the writer morally corrupt for it.

Also, machines that don't remotely resemble humans/animals (such as in your examples of factories or computers) aren't likely to cause emotional responses in people. You'll probably never see yourself jailed for harming your own computer.

And this post is even longer than the first. Oh well. I could talk about this topic all day.