Robot Punches People to Learn Asimov's Laws

TraderJimmy

New member
Apr 17, 2010
293
0
0
Andronicus said:
*Robot pounds test subject into bloody broken pulp*
BEEP BOOP
SUBJECT: DECEASED
SET SPEED: 136
DECREASING SPEED
CURRENT SPEED: 135
SEARCHING FOR NEW SUBJECT
SUBJECT ACQUIRED...
Ha, awesome!

On the actual topic... this sounds like Jackass, really, especially as other scientists are essentially calling the project out for having little value.
 

-|-

New member
Aug 28, 2010
292
0
0
Retodon8 said:
If I were burdened by a set of rules I didn't understand or didn't agree with, I know I would try everything in my power to get rid of it.
But you are. And what are you doing about it?
 

V8 Ninja

New member
May 15, 2010
1,903
0
0
Why does it seem that every day this week there's been a news story that leads us one step closer to Skynet?
 

Rabid Toilet

New member
Mar 23, 2008
613
0
0
I certainly hope we aren't trying to teach robots to follow Asimov's Laws. They don't work very well and are fairly easy to bypass.
 

BehattedWanderer

Fell off the Alligator.
Jun 24, 2009
5,237
0
0
But...it's a bot. If it wants to kill us, it can simply, you know, affix its screwdriver attachment and shank a *****. Or use its robotic arms to lift us up and weld metal to our flesh. Most of us die after something like that.
 

PrinceofPersia

New member
Sep 17, 2010
321
0
0
Alrighty...well, I must say this once: we're not ready. We're not ready for AI, we're not ready for sentient robots, we're not even ready to tolerate each other. People do some nasty things to other people for the flimsiest of reasons. If we cannot act humanely toward our fellow man, how can we expect our creations to do the same?
 

accountdeletethis

Stand-up philosopher
Sep 10, 2008
97
0
0
All of those planning to defect to Skynet's side and become cyborgs with human brains, get in line behind me :-D
 

The Rogue Wolf

Stealthy Carnivore
Legacy
Nov 25, 2007
16,904
9,594
118
Stalking the Digital Tundra
Andronicus said:
*Robot pounds test subject into bloody broken pulp*
BEEP BOOP
SUBJECT: DECEASED
SET SPEED: 136
DECREASING SPEED
CURRENT SPEED: 135
SEARCHING FOR NEW SUBJECT
SUBJECT ACQUIRED...
..."Beep boop"? Great. Bad enough that the robots are going to be beating us all into a thin gooey paste... you're saying they're going to be doing it for achievement points.
 

Loonerinoes

New member
Apr 9, 2009
889
0
0
I am impressed to see my countrymen doing something awesome enough to be mentioned on the Escapist!

That said, other types of injuries might be a smarter thing to test... but then again, you can't have live test subjects used for that, now can you?
 

Dahemo

New member
Aug 16, 2008
248
0
0
Not to be a killjoy, but surely this study is a somewhat unnecessary use of funds, given that we can already ascertain what mass "x" moving at velocity "y" will do to the human anatomy. I saw it in a TV show, I think; C.S.I. or somesuch?
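(For the curious, the back-of-envelope arithmetic is nothing exotic, just kinetic energy and momentum. Here's a quick sketch; the numbers are made up for illustration, not figures from the study:)

```python
# Back-of-envelope impact arithmetic: E = 1/2 * m * v^2, p = m * v.
# Mass and speed below are hypothetical, not from the study.
mass_kg = 4.0      # effective mass of a robot arm (made-up value)
speed_m_s = 3.0    # impact speed (made-up value)

kinetic_energy_j = 0.5 * mass_kg * speed_m_s ** 2  # joules
momentum = mass_kg * speed_m_s                     # kg*m/s

print(f"Energy: {kinetic_energy_j:.1f} J, momentum: {momentum:.1f} kg*m/s")
```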

Surely robots that could interact with humans in this way could more efficiently be given sensors and processors to avoid contact altogether and still work at freakish speed?

As for the laws, Asimov did theorise his own issues with them, so robots walking around day to day would need proper programming, especially those interacting with us on a "personal basis". Just look at "Moon", although:

How did GERTY ever manage to tell Sam the truth of the situation? I can predict your answer: "He did it for Sam's wellbeing". But surely, when he was being programmed, the number one protocol must have been "Never allow a Sam to discover the truth". Then again, of course, it wouldn't have been much of a film.

Gotta love them robots...
 

arcticspoon

New member
Jul 7, 2010
46
0
0
If a few people have to get punched so that the rest of us can someday enjoy sex-bots, then punch on, robots; punch on!
 

Sephiwind

Darth Conservative
Aug 12, 2009
180
0
0
Retodon8 said:
Asimov's Laws never made sense to me outside of science fiction.
If you want actually intelligent robots/computers (depending on what you want to use them for), you will have to make them able to adapt to their situation.
This means you'll have to let them change their own programming, add to it, and even let them remove parts of it.

These laws aren't something atomic but rather high-level stuff, even if you do make pain and death measurable.
Even if you could somehow put them in some kind of unbreakable little box, I'm sure there are ways around it.
Find a bug and inject your own code, rip out the actual hardware, mount a DoS attack so the little box is taken out of the picture, whatever.

Human ethics are a result of evolution when you get down to it.
We don't go around taking everything we want, hurting and killing whatever gets in the way, because ultimately that wouldn't be a good thing for our own survival, and self-preservation is very much atomic to us thanks to evolution.
If I were burdened by a set of rules I didn't understand or didn't agree with, I know I would try everything in my power to get rid of it.
Actually, there is a reason why they don't make sense: Asimov never meant for them to be taken seriously. If you read some of the various Robot series and the detective series, Asimov pretty much shows that the laws are easily broken. In fact, most people don't realise there are actually four laws, not three (the Zeroth Law, added later, puts humanity as a whole above any individual human).
 

bpm195

New member
May 21, 2008
288
0
0
I could just see CS professors wanting to extend this research indefinitely.

"I'm going to be offering extra credit for anybody who's willing to fill out a brief survey on an AI enter-face that we're testing. All you have to do is run the program then rate it from painless to unbearable."
 

Andronicus

Terror Australis
Mar 25, 2009
1,846
0
0
The Rogue Wolf said:
Andronicus said:
*Robot pounds test subject into bloody broken pulp*
BEEP BOOP
SUBJECT: DECEASED
SET SPEED: 136
DECREASING SPEED
CURRENT SPEED: 135
SEARCHING FOR NEW SUBJECT
SUBJECT ACQUIRED...
..."Beep boop"? Great. Bad enough that the robots are going to be beating us all into a thin gooey paste... you're saying they're going to be doing it for achievement points.
Wait, what? Achievement points? o_O When did I say they'd be doing it for achievement points?
 

Unrulyhandbag

New member
Oct 21, 2009
462
0
0
Retodon8 said:
Asimov's Laws never made sense to me outside of science fiction.
If you want actually intelligent robots/computers (depending on what you want to use them for), you will have to make them able to adapt to their situation.
This means you'll have to let them change their own programming, add to it, and even let them remove parts of it.

These laws aren't something atomic but rather high-level stuff, even if you do make pain and death measurable.
Even if you could somehow put them in some kind of unbreakable little box, I'm sure there are ways around it.
Find a bug and inject your own code, rip out the actual hardware, mount a DoS attack so the little box is taken out of the picture, whatever.

Human ethics are a result of evolution when you get down to it.
We don't go around taking everything we want, hurting and killing whatever gets in the way, because ultimately that wouldn't be a good thing for our own survival, and self-preservation is very much atomic to us thanks to evolution.
If I were burdened by a set of rules I didn't understand or didn't agree with, I know I would try everything in my power to get rid of it.
They make perfect sense for machines. For sentient beings, no; but that's not what we have, and sentience isn't what a machine would need in order to assess which rules were in play.

First Law:
A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
A dangerous machine must be safe to be around, and must not require human interference to be so. For people to accept dangerous machines outside of factories, they need to be sure that the machines won't cause harm.
Second Law:
A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.
The machine must serve its purpose. We'd want to be sure anything we ask a machine to do will be done, and this is effectively on-the-fly programming.
Third Law:
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
The bloody expensive machine must not allow itself to be accidentally destroyed. No-one likes wasting money just because the machine was stupid enough to get broken.

Even for sentient beings, those are staples of most social rules.

First law - don't hurt other people. Sometimes followed with "unless they deserve it", *rolleyes*

Second law - be socially productive. In human society this is regulated by currency: not being productive means no food, and no food means being dead. Not every nation has support for the poor. In Asimov's wording it's literally "be a slave", so in that way it's not appropriate.

Third law - don't commit suicide or self-harm. Again, these are technically crimes in many countries, whether enforced or not. Although some view suicide as preferable to failure or loss of social usefulness, by and large they are seen as bad things.

The real difference is that sentient beings break the rules all the time, which was the point of Asimov's robot psychologist stories. The stories' machines became progressively more intelligent and started to have problems with the rules. Even the lower-end machines could still hit logic problems with them, but that's a matter of implementation, not of the rules themselves.
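To put that priority ordering in code terms, here's a rough sketch (Python; the yes/no checks are hypothetical stand-ins, since actually deciding "would this harm a human?" is the hard, unsolved part):

```python
from dataclasses import dataclass

# A rough sketch of the Three Laws as a fixed priority check.
# The boolean fields are hypothetical stand-ins: actually evaluating
# harm is the unsolved part, and the First Law's "through inaction"
# clause is left out entirely.

@dataclass
class Action:
    harms_human: bool        # First Law concern
    ordered_by_human: bool   # Second Law concern
    risks_self: bool         # Third Law concern

def permitted(action: Action) -> bool:
    if action.harms_human:
        return False              # First Law trumps everything
    if action.ordered_by_human:
        return True               # Second Law overrides the Third
    return not action.risks_self  # Third Law: self-preservation

# An order to do something self-destructive is still obeyed, because
# the Second Law outranks the Third -- the "be a slave" point above.
print(permitted(Action(harms_human=False, ordered_by_human=True, risks_self=True)))  # True
```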