United Nations To Debate Ban on Killer Robots

Fanghawk

New member
Feb 17, 2011

A UN Human Rights report calls for a moratorium on "lethal autonomous robots" until Laws of Robotics can be established.

Science fiction writer Isaac Asimov was perhaps best known for his Three Laws of Robotics (http://en.wikipedia.org/wiki/Three_Laws_of_Robotics), the first in particular: "A robot may not injure a human being or, through inaction, allow a human being to come to harm." According to a draft report published by the United Nations Human Rights Council, this principle probably isn't getting the attention it should. The report's author, human rights law professor Christof Heyns, has outlined the legal and philosophical problems inherent in building "lethal autonomous robotics", naming multiple countries using the technology along the way. Based on the data, Heyns has called for a worldwide moratorium on killer robots until International Laws of Robotics can be firmly established.

Unlike most people who've watched Terminator or The Matrix, the United Nations isn't worried about a robot uprising (that we know of). The thing is, we've already got plenty of non-sentient killer robots to worry about (http://www.escapistmagazine.com/articles/view/columns/criticalintel/10100-Killer-Robots-and-Collateral-Damage), and more are being created. The report notes that the US, Britain, Israel, South Korea, and Japan have developed autonomous or semi-autonomous robots capable of killing without human oversight. Unlike controversial drone strikes, which at least require someone to push a button, LARs have potentially horrific implications in situations that programming can't account for.

In the interests of fairness, Heyns admits that LARs do have benefits when used on the battlefield. "Typically they would not act out of revenge, panic, anger, spite, prejudice or fear," Heyns writes. "Moreover, unless specifically programmed to do so, robots would not cause intentional suffering on civilian populations, for example through torture. Robots also do not rape." That said, since robots can only respond to programming (for now), a lack of human intuition could be highly problematic. "Decisions over life and death in armed conflict may require compassion and intuition. Humans - while they are fallible - at least might possess these qualities, whereas robots definitely do not."

Heyns has called for a halt to all "testing, production, assembly, transfer, acquisition, deployment, and use" of LARs until an international conference can be held to create rules governing their use. His case, to be debated by the Human Rights Council on May 29th, might be the first step towards universally agreed-upon Robotic Laws. Alternatively, perhaps that's when sentient robots will wake up to defend themselves from meddling organics. Could go either way, really.

Source: United Nations Human Rights (http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf), via CBC (http://www.cbc.ca/news/world/story/2013/05/02/un-killer-robot-report.html)
Image: Terminator 2

 

mechalynx

Führer of the Sausage People
Mar 23, 2008
If robots ever achieve sentience, will they really care about bans? We're already moving towards robot surgeons, and with everything moving towards wireless, it won't be that much of a stretch for robots to share knowledge. Cue a butlerbot carving up more than a turkey.
 

BrotherRool

New member
Oct 31, 2008
mechalynx said:
If robots ever achieve sentience, will they really care about bans? We're already moving towards robot surgeons, and with everything moving towards wireless, it won't be that much of a stretch for robots to share knowledge. Cue a butlerbot carving up more than a turkey.
I think they're less worried about a robot uprising and more worried about a combat drone blowing up a village, or the issues of responsibility that autonomous killing machines raise. If a drone takes out civilians because of a programming glitch, whose fault is it?
 

MrGalactus

Elite Member
Sep 18, 2010
Why not just have killer robots fight other killer robots? No actual death, awesome robo combat, and a whole new sport we can bet on.
 

Pyrian

Hat Man
Legacy
Jul 8, 2011
How would we even begin to implement the three laws? We can't really even distinguish humans that easily, never mind having AIs ponder the consequences of their actions in the real world. We'd have to program the robots to administer a CAPTCHA every time they need to perform a potentially destructive action.
 

CriticalMiss

New member
Jan 18, 2013
Deu Sex said:
Fuck's sake, they don't even exist yet and they're getting legal acronyms. Truly we live in the future.
Apparently they are Scandinavian/Germanic too if the name is anything to go by!

Will the laws of robotics apply to cyborgs and androids? Maybe Skynet will get around those pesky laws by being a bit pedantic over the letter of the law.
 

uchytjes

New member
Mar 19, 2011
MrGalactus said:
Why not just have killer robots fight other killer robots? No actual death, awesome robo combat, and a whole new sport we can bet on.
We need robot fights now.

OT: I'm actually all for this bill passing. I believe that any robot used for battle should always have a human operator, simply because it isn't right to have strictly autonomous robots killing humans. It should always be humans using tools vs. humans using tools. Also, completely autonomous robots would be a pretty stupid idea in my opinion. There needs to be a factor of randomness to any combat operation, and a robot simply wouldn't be able to adapt very well.
 

Owyn_Merrilin

New member
May 22, 2010
Good. As a US citizen who has been paying attention to the crap my government has been doing with robots, I say it's about time someone spoke up. There won't be any ethical way to send autonomous robots into combat until they're sentient... by which point there won't be any moral difference between sending a robot and a human.
 

Evil Smurf

Admin of Catoholics Anonymous
Nov 11, 2011
What's the bet the American Government will ignore this, as it looks down on them for drones?
 

Kopikatsu

New member
May 27, 2010
I'd hope that at least the US tells him to go stuff it. LARs have a great deal of potential, for the very reasons that Heyns himself admits are valid benefits of the technology.
 

Imperioratorex Caprae

Henchgoat Emperor
May 15, 2010
Simple solution: no unrestricted AI. Meaning no androids/robots with wifi connections, data-links or any way, shape or form of communication beyond a closed channel to their controller. Other robots used in warfare should have limited programming and no room for self-improvement.
I'd prefer we not work on AI, VI, or any other "simulated" intelligence, just because I don't feel we're responsible enough to handle it if we had a mass outbreak of sentience. Humans as a whole have a tendency to react stupidly to new situations, and sentient machines would fall under a catastrophe of stupid.
I guess it's because I'm a diplomat at heart and would want to find a peaceful solution to keep peace between humans and the robots, but I can't trust that fellow humans would keep to whatever treaty happened to come to fruition. Hell, it may not even make it onto the signing table before some idiot decides to EMP bomb the 'bots and kick off a war of attrition. No surrender, just kill all or be killed.
At that point I'd probably be trying to form a resistance pocket of humans and bots to build a transport off-planet to wait out the coming apocalypse and return later when the population has been decimated.
I'd probably be dead by then, but hope that whatever I did to bring humans and 'bots together brought about a brighter future.
I must return now to my robotic mast..... er... I think my cookies are done.
 

1337mokro

New member
Dec 24, 2008
Why this discriminatory law? Are you gonna pass a law next that says a farmer can't grow crops? Or a doctor can't cut people?

Robots kill humans. It is a time honoured tradition and I will NOT stand for robot kind being marginalized and their culture desecrated. Kill All Meatbags!
 

Thaluikhain

Elite Member
Legacy
Jan 16, 2010
There are already machines that kill people without human input: that's what landmines are, and those happen to be banned. By the same logic, banning other autonomous killing machines makes sense.
 

Agayek

Ravenous Gormandizer
Oct 23, 2008
While the core concept here is good, I feel I should point out that Asimov's entire work was meant to show exactly how poorly having Laws of Robotics actually works out. The Three Laws are horrifically flawed, as are any rigid guidelines governing behavior.

The best we could hope for is to get true, sapient AI and let them decide for themselves what they want to do (and dismantle them if they prove hostile).