MIT Researcher Proposes Rights for Robots

Marshall Honorof

New member
Feb 16, 2011
2,200
0
0
MIT Researcher Proposes Rights for Robots


Humans may find it beneficial to grant robots rights on par with pets.

We should all be pretty well aware at this point that the robot apocalypse (or "robopocalypse," if you will) is on its way. Our most gifted storytellers have been warning us about it for years, from the legend of the golem [http://en.wikipedia.org/wiki/Golem] to Skynet. In their latest volley against an unsuspecting human race, our metallic overlords-to-be have conscripted MIT researcher Kate Darling to draft a new research paper that suggests humans grant rights to robots. According to Darling, robots don't need rights on par with humans (yet), but due to the emotional connections humans can form with them, we may find it beneficial to ascribe to them rights similar to those of our pets.

All right, calling Darling a pawn of the robots may be going too far, considering that her 18-page paper lays out a digestible, cogent argument for robot rights sooner rather than later. "The typical debate surrounding 'rights for robots' assumes a futuristic world of fully autonomous and highly sophisticated androids that are nearly indistinguishable from humans," she writes. "While technological development may someday lead to such a Blade Runner-esque scenario, the future relevant legal issues are currently shrouded by unforeseeable factors." Darling describes the robots of today, from Sony's robotic dogs to Paro the seal [http://en.wikipedia.org/wiki/Paro_(robot)] (which has seen proven success in geriatric therapy) and even Roomba vacuum cleaners, explaining that each one can generate a companionate, emotional reaction in humans, especially in small children. This interaction, she argues, is not the same as an interaction with nonresponsive toys. "While a child is aware of the projection onto an inanimate toy and can engage or not engage in it at will, a robot that demands attention by playing off of our natural responses may cause a subconscious engagement that is less voluntary."

Children are not the only subjects who anthropomorphize robots, either. Darling describes a situation in which a battle-hardened army colonel could not bear to watch a mine-detecting robot modeled after a stick insect get leg after leg blown off during a trial run. "[The] colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg. This test, he charged, was inhumane," Darling quotes from a 2007 article by Joel Garreau. Even gamers should be familiar with granting human qualities to totally inanimate objects simply because they mimic a small human quality. "In the video game Portal ... which requires the player to incinerate the companion cube that has accompanied them throughout the game, some players will opt to sacrifice themselves rather than the object, forfeiting their victory."

The obvious criticism of Darling's paper is that simple companion robots, unlike animals, have no desires, feelings, or capacities for pain and pleasure. However, she reminds readers that "our desire to protect animals from harm may not necessarily be based on their inherent attributes, but rather on the projection of ourselves onto these animals." Furthermore, a discussion of robot rights in the present might make a similar discussion easier if - or when - robots develop sentience somewhere down the line. Just remember that protecting Paro the seal today might make it that much harder to fight back against the Terminator a few years from now.

Source: Computerworld [http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2044797]

 

sethisjimmy

New member
May 22, 2009
601
0
0
Eh, just because something is animate and we can feel emotion towards it doesn't mean it needs rights. Fire is animate and reacts to our stimuli and demands our attention, but it doesn't need rights.
Maybe that's a poor comparison, but I still think it's silly. Pets are actually alive; robots just mimic life. Just because we can get emotionally attached to something doesn't mean it's now an entity on the level of animal life.

Edit: I've received some fair criticism for this post, and I'll have to rethink my position. I think rights on the level of pets is a weird position to take, however. That still makes them a piece of property or an ownable object, doesn't it? I guess the level of rights will change as the robot's level of sentience grows.
 

mooncalf

<Insert Avatar Here>
Jul 3, 2008
1,164
0
0
A robot sufficiently advanced as to be indistinguishable from a human might be imbued with equivalent rights, but I think before that point all that we can meaningfully concern ourselves with is how we view a person's care of possessions as proper or improper. I know there are folks out there who think it downright criminal to let a good thing go to waste. Is there a point where poor maintenance or deliberate misuse becomes a kind of vandalism?
 

disgruntledgamer

New member
Mar 6, 2012
905
0
0
Robots have the right to toast my bread, and until they're close to making robot prostitutes with feelings (why you would want a robot prostitute with feelings is beyond me; the whole point of having a robot prostitute is that she doesn't have feelings), robots should just shut up and get what's coming to them.

OK, joking aside, this is absurd. At the same time, why don't we draw up intergalactic space travel rules and regulations, and decide what we can and cannot exploit from aliens from other planets, not just the ones from Mexico. We've got enough problems in the now to keep us occupied without worrying about problems in the distant future that may not even be an issue.
 

mad825

New member
Mar 28, 2010
3,379
0
0
Uhuh. I suppose the RSPCA will charge me with neglecting my Big Mouth Billy Bass and ban me from buying more. Robots are a long way off from needing "rights," and we would need to ensure that they don't dominate us first.

Somebody get that sappy woman a puppy or kitten so that she can go and complain about animal rights with PETA or someone.
 

The Random One

New member
May 29, 2008
3,310
0
0
So is she saying the reason animals have rights is that we silly humans ascribe feelings and emotions to them that they don't actually have? I suppose stray dogs don't actually starve then?
 

iniudan

New member
Apr 27, 2011
538
0
0
sethisjimmy said:
Fire ... demands our attention
I think you've had way too many of those "smokes" if a fire starts to demand anything of you, or maybe you happen to be the next Moses.

------------------------------------------

I think MIT is just preparing for its role in the post-apocalyptic world of Fallout, for after all, that is where androids shall come from. =p
 

gritch

Tastes like Science!
Feb 21, 2011
567
0
0
sethisjimmy said:
Pets are actually alive, robots just mimic life. Just because we can get emotionally attached to something doesn't mean it's now an entity on the level of animal life.
Actually, I would argue it's our ability to project our own feelings and emotions onto another object/animal/being that defines our entire conception of ethics and rights. Ethics, for me, have always been derived from our own personal desires. I would not like to die, therefore I consider you killing me to be bad. I am also able to project my own thoughts onto other objects, animals, or people. The logic would go: I don't like to die, so I bet that guy over there wouldn't like to die either. The ease with which I can empathize with the object determines the degree of "badness".

As a human I can easily empathize with another human, therefore I consider killing another human very bad. Most humans are also able to empathize with animals of higher intelligence (dogs, cats, etc) and those that share many common features with humans (chimps, apes). Most people would consider killing these animals to be bad, at least far worse than killing animals of lesser intelligence and alien features (such as insects). Often people can form strong familial bonds with such animals (pets as they're called).
Anything that people can empathize with strongly enough will incite an ethical response and thus deserves rights. If people can empathize with non-living robots as well as they can with animals, those robots deserve rights on par with animals. Claiming something deserves rights only because it's alive is nothing more than a rationalization.

Geez sorry about the long post. I guess I just find this topic very interesting.
 

The Rookie Gamer

New member
Mar 15, 2010
806
0
0
DVS BSTrD said:
Marshall Honorof said:
The obvious criticism of Darling's paper is that simple companion robots, unlike animals, have no desires, feelings, or capacities for pain and pleasure.
I still think they'd rather not be in another Will Smith movie.
They've done the calculations and have decided to curry favor with our inevitable robolords, the magnificent Quisling bastards.

In all honesty, if they start developing feelings, which as far as I know they can't, I'd be a tad more hesitant about killing them.

Captcha: I am here

It's begun.
 

GenGenners

New member
Jul 25, 2012
344
0
0
The robots of today and the near future are not what sci-fi stories and Hollywood would have you believe. There isn't nearly enough resemblance between a robot and the thing it's mimicking to justify any unique treatment.

Thanks to sci-fi, people tend to forget what a robot actually is.

"If x terrain scenario = y.safe, action [move left leg forward] by D=q/y.safe degrees"

^This example pretty much sums up what a robot AI actually is. An electronic checklist of possible inputs, responses and outcomes. The only reason a robot is considered 'advanced' is because the actual checklist in a real robot is a lot longer than the example mentioned above, and it goes through it all rather quickly.
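
To make that concrete, here is a minimal sketch in Python of the kind of condition/action checklist described above: a fixed table of rules, evaluated in order, with the first match deciding the action. The sensor names, thresholds, and actions are invented for illustration and aren't taken from any real robot.

# A checklist-style robot "controller": a fixed table of
# (condition, action) rules checked in order; first match wins.
# All names and numbers here are made up for illustration.

def step_controller(sensors):
    """Return an action string for the current sensor readings."""
    rules = [
        (lambda s: s["battery"] < 0.1,         "return to charger"),
        (lambda s: s["terrain_safety"] >= 0.8, "move left leg forward"),
        (lambda s: s["terrain_safety"] >= 0.4, "probe ground with front leg"),
    ]
    for condition, action in rules:
        if condition(sensors):
            return action
    return "stop and wait"  # default when no rule matches

print(step_controller({"terrain_safety": 0.9, "battery": 0.7}))
# prints: move left leg forward

A real robot's table is far longer and gets evaluated far more often, but it's the same mechanism: inputs looked up, outputs returned, nothing in it that wants or feels anything.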
 

Twilight_guy

Sight, Sound, and Mind
Nov 24, 2008
7,131
0
0
Yeah, come talk to me when robots have sentience, like all other beings with rights. I'm not giving my toaster rights because somebody has developed an emotional connection with it. I live in a world where a dude married a fictional game character because he developed an emotional connection with it. People are idiots and form emotional connections for stupid reasons. If it's not self-aware and doesn't feel, it doesn't need rights.
 

MortisLegio

New member
Nov 5, 2008
1,258
0
0
I'm sorry, but no. Machines are tools and nothing more. Robots should have the same rights as a hammer. Why should machines have rights? Because we make them look human? Because they can talk? Despite looking human, machines are NOT alive. Machines are an imitation of life, but are not truly alive.
 

Xan Krieger

Completely insane
Feb 11, 2009
2,918
0
0
MortisLegio said:
I'm sorry, but no. Machines are tools and nothing more. Robots should have the same rights as a hammer. Why should machines have rights? Because we make them look human? Because they can talk? Despite looking human, machines are NOT alive. Machines are an imitation of life, but are not truly alive.
I was also about to make the hammer comparison (for voting records, my hammer registered Republican), but you beat me to it. A Furby is a robot, a simple entertainment one. Should it ever have rights? Nope, and if I see one in the voting line I'm kicking it like a football, at which point I'll go to prison for attempted murder (or murder if they can't fix it).
 

Joccaren

Elite Member
Mar 29, 2011
2,601
3
43
As much as I'll battle for robot rights in other cases, not in this one.
It'd be an interesting discussion if it extended to computers like Watson, but nothing is really that advanced yet. Most machines are programmed to carry out a fixed set of instructions, not to think for themselves. When we create a robotic dog that doesn't just play animations but actually has a will of its own, where its processor works out what it likes and doesn't like, whether it needs charging, and what it should do next, then I'll fight for those things to be treated well, as that's nearing the level of intelligence many animals have.
When we hit sentient robots, if there are no rights by then they really need to be implemented, especially since we can't really claim we're any more 'alive' than a robot that is sentient and able to think for itself - even if by a set of algorithms [which, odds are, is how our brains work].

For now though, no. Nothing is that advanced yet.
 

thepyrethatburns

New member
Sep 22, 2010
454
0
0
There was a comic strip years ago called Saturnalia that dealt with that AND had a link to a group that was trying to draft rights for robots. The group claimed that, in the 1700s, granting rights to animals would have seemed just as absurd as discussing the question now.

It's good to see that I was not the only one who read it.
 

surg3n

New member
May 16, 2011
709
0
0
Durrrrrr.... in the future, when androids are indistinguishable from humans, they should all have the same rights as we would afford our pets?

What rights do pets have?
The right to shit on the carpet as long as they show remorse afterward?
The right to hump legs?
The right to eat half a mouse and leave the other half for us to enjoy?

And how would we respect these rights if we can't even tell whether they are human or a fucking Dyson? Now I must go and have a coffee, from a machine. I might ask if it has any kids, I'll thank it, and I might even tell it a mildly sexist joke, just to see if it's the future yet.
 

Therumancer

Citation Needed
Nov 28, 2007
9,909
0
0
Hmmm, well, I've never been of the opinion that artificial intelligence is inevitably going to lead to warfare and hate. I'm a bit more of an Asimov-like optimist, and tend to gravitate more towards science fiction and science fantasy that have humans and robots working together.

That said, there are a lot of issues at play with robotics that need to be considered, among them that robots ultimately need to be constructed, which is generally not going to be a cheap process, especially for sophisticated ones. If you ascribe full rights and independence to such a thing, who would create one, knowing there is going to be little benefit to it? A mining 'droid that can choose not to mine and, say, become a concert violinist could be a potential loss of millions of dollars.

Arguments about slavery and such tend to overlook that human slaves were not created outright by their masters, but evolved alongside them and had their independence taken away, whereas no such assumption exists for robots, which actually were created with a purpose in mind.

Such are the things to consider, and truthfully I have no good answers to those problems, or at least none that satisfy me.

That said, the key element of most "AIs gone haywire" fiction tends to revolve around people freaking out when AIs start asking existential questions, and trying to basically kill them. I think the first step towards any real meaningful interaction with AI is going to be to expect this, and answer honestly.

I'll also say that there is a tendency to project the worst elements of human nature onto other intelligences, like aliens or robots, rather than the more positive ones that have led to the creation of progressive societies. I, for one, don't necessarily think that a robot is going to inherently decide to destroy that which is weaker than itself simply because it's better. Just as I do not believe that contact between aliens and humans would necessarily be hostile: I'd argue a society advanced enough to master interstellar travel and cross the galaxy to meet another species has quite probably grappled with all of the moral complexities and questions that we have, and come to the realization that, say, wiping out every less advanced species and stealing their stuff isn't exactly right. There is no reason to believe that wanting to at least be the good guys is a perspective unique to us, or that we can't see space exploration and discovery through the eyes of something like The Federation from "Star Trek" as it develops... which is to say cautious at times, but not inherently hostile towards everything we run into.

At any rate, I'm rambling. That said, I don't think "robotics" is advanced enough to make a distinction between a robot and a device. I can understand the argument for laying a groundwork, but I think it's a little too early in the process. That's not an excuse so much as a belief that if we jump the gun by inserting too much morality too early, we're never likely to get to the point where such moral pondering becomes relevant. If you start seeing current robots as pets, you'd arguably be saying that you can't tinker with your vacuum cleaner or whatever without following animal testing guidelines, and that's just insane. I imagine repair centers would love to argue you can't fix a clog on your own and need to bring the self-propelled vacuum in to them for the equivalent of accredited veterinary surgery. :)
 

Old Father Eternity

New member
Aug 6, 2010
481
0
0
So ... as on Aurora, or eventually as described by Roj Nemmenuh Sarton. The latter would not be bad, really. In fact, Asimov's works hold, in my opinion, quite a few useful ideas.


Also, to end any form of life (sentience would be a pretty big requisite) without explicit need to do so is a meaningless waste.