The Moral Dilemma of Self-Driving Cars - Save the Driver Or Others

John Keefer

Devilish Rogue
Aug 12, 2013
630
0
0

With computers in cars becoming more intelligent, it is only a matter of time before self-driving vehicles are faced with an impossible situation, one that requires proper programming in advance.

Car drivers are slowly losing various aspects of control as vehicle technology marches forward. There are cars that can parallel park themselves, set cruise control on their own and even pass other vehicles without the driver lifting a finger.

With this increase in automation, scientists are already looking into how to program smart vehicles [http://www.technologyreview.com/news/539731/how-to-help-self-driving-cars-make-ethical-decisions/] for the moment they are faced with an impossible, almost no-win situation. A key to that programming will be public perception and reaction, something that was largely unknown until a recent study by Jean-Francois Bonnefon of the Toulouse School of Economics in France.

Bonnefon and his associates set out to measure public opinion and draw conclusions from it. Using Amazon's Mechanical Turk [https://www.mturk.com/mturk/welcome] to gather input from several hundred individuals, the group posed a range of scenarios, including ones in which the driver is killed but many others are saved. They also added variables such as the age of the driver or the potential victims, whether children were in the car, and even whether the driver was the person answering the questions.

The results were somewhat predictable: as long as they were not the driver themselves, respondents were in favor of programming smart cars in a way that would minimize the potential loss of life.
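For illustration, the "minimize the loss of life" rule that respondents favored amounts to a few lines of decision logic. The sketch below is hypothetical: the `Option` fields, the `utilitarian_choice` function, and the casualty numbers are all invented here, not taken from the study or from any real vehicle's software.

```python
from dataclasses import dataclass

@dataclass
class Option:
    """One possible maneuver and its predicted outcome (hypothetical model)."""
    name: str
    expected_deaths: float  # predicted fatalities if this maneuver is taken
    occupant_dies: bool     # whether the car's own occupant is among them

def utilitarian_choice(options: list[Option]) -> Option:
    """Pick the maneuver that minimizes expected loss of life,
    with no special weight given to the car's own occupant."""
    return min(options, key=lambda o: o.expected_deaths)

# Hypothetical numbers: staying on course kills three pedestrians,
# swerving into a wall kills only the occupant.
options = [
    Option("stay_on_course", expected_deaths=3.0, occupant_dies=False),
    Option("swerve_into_wall", expected_deaths=1.0, occupant_dies=True),
]
print(utilitarian_choice(options).name)  # -> swerve_into_wall
```

In these terms, the study's finding is that respondents approved of something like `utilitarian_choice` in other people's cars, but preferred a rule that heavily weights `occupant_dies` in their own.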

"[Participants] were not as confident that autonomous vehicles would be programmed that way in reality-and for a good reason: they actually wished others to cruise in utilitarian autonomous vehicles, more than they wanted to buy utilitarian autonomous vehicles themselves," the study concludes.

But then, that result opened up even more questions: "Is it acceptable for an autonomous vehicle to avoid a motorcycle by swerving into a wall, considering that the probability of survival is greater for the passenger of the car, than for the rider of the motorcycle? Should different decisions be made when children are on board, since they both have a longer time ahead of them than adults, and had less agency in being in the car in the first place? If a manufacturer offers different versions of its moral algorithm, and a buyer knowingly chose one of them, is the buyer to blame for the harmful consequences of the algorithm's decisions?"

In the end, the study says that these questions need to be addressed, and the algorithms formed, sooner rather than later as smart cars become more and more prevalent.

Source: MIT Technology Review [http://arxiv.org/abs/1510.03346]

 

Kross

World Breaker
Sep 27, 2004
854
0
0
There's no way I'm ever (voluntarily) driving in a machine that prioritizes killing me over anyone else.

There's also no way I'd ever want to use an automated vehicle unless every other vehicle on the road is also automated, as that swarm coordination is where the true benefit resides.
 

lancar

New member
Aug 11, 2009
428
0
0
I don't even have a car, but to me the answer is simple: save the one who is not in control (i.e., the driver).

I would NEVER buy a vehicle that would willingly sacrifice me to save others without even asking me first. And since there's no time for an autonomous car to ask me for my opinion in the matter once an accident is imminent, I'll have to go with the computer that thinks I'm more important than whoever decided to step out in front of me.
 

Space Jawa

New member
Feb 2, 2010
551
0
0
I'll go one further and say I have no intention of ever buying a self-driving car at all.
 

MonsterCrit

New member
Feb 17, 2015
594
0
0
Catch-22. It's the sort of call that people make all the time in real life. But the thing is, when a human does it, there's not much thought in the action; it's a reflexive twitch. For a computer to be programmed that way, however, means someone put a lot of deliberate thought into it, and that means someone decided, indirectly, that someone should live and someone else should die.
 

bladestorm91

New member
Mar 18, 2015
49
0
0
Here's the thing: if a self-driving car is following all the rules and still ends up in the situation described in that research, then the car and driver are not at fault; the ones crossing the road are.

That makes things simpler: the people crossing the road can go to hell. You and the car that followed the rules should be saved no matter what. And even if that weren't the case, it would end up that way anyway, because no one would buy a car that would kill them.
 

Pyrian

Hat Man
Legacy
Jul 8, 2011
1,399
8
13
San Diego, CA
Country
US
Gender
Male
Most of these things ascribe a level of awareness that the cars won't really have. Is that a person crossing the street, or a cardboard cutout blown by the wind?
 

MTGorilla

New member
Sep 4, 2011
1
0
0
Pyrian said:
Most of these things ascribe a level of awareness that the cars won't really have. Is that a person crossing the street, or a cardboard cutout blown by the wind?
In poor visibility conditions, how would a human fare any better? The autonomous car has the advantage of infrared sensing equipment, which can easily tell the difference between a real living thing and a piece of cardboard blowing in the wind.

I feel that autonomous vehicles should prioritize the safety of the occupants in any potentially hazardous situation, but this does require a lot of forethought. Infrastructure needs to be re-examined, and the education of pedestrians is a must; it's one thing to have the vehicle want to keep its occupants safe, and another to have a crazed Terminator-mobile that casually mows down pedestrians.
 

Silverbane7

New member
Jul 1, 2012
132
0
0
Have to agree with the facebooker on the main page... just build underground (and overground) crosswalks, and then no one gets hurt (at least in the situation of people crossing the road where they should not be).

However, I also have to agree that it is better to look at the problem from the engineering side: should your Johnnycab decide that you are less important in a crash situation than that crowd of walkers suddenly thrust out in front of it, make sure that even if it DOES crash into a wall or take out a railing, the passengers will be saved, even if they have to be covered in crash foam and then dug out.
Though we would need the foam for that...

Stop playing with machines trying to make them choose which human to save, and start working on the materials that mean you don't *have* to program them to choose who to save.
 

ritchards

Non-gamer in a gaming world
Nov 20, 2009
641
0
0
John Keefer said:
There are cars that can parallel park themselves, set cruise control on their own and even pass other vehicles without the driver lifting a finger.
Oh, they may lift a finger. Possibly even two. ;)
 

hermes

New member
Mar 2, 2009
3,865
0
0
bladestorm91 said:
Here's the thing. If a self-driving car is following all the rules and it comes to the situation mentioned in that research, then the car and driver are not at fault, but the ones who are crossing the road.

That makes things simpler, the people crossing the road can go to hell. You and the car who followed the rules should be saved no matter what. This would be true even if that wasn't the case because no one would buy a car that would kill you.
There is some point to that...

Other than the case of small children, this could lead to some negligent and potentially murderous behavior. Suppose 100% of cars were equipped with programs like these (as others have pointed out, self-driving cars are a hard sell unless there is already a considerable number of self-driving cars on the road); then someone simply walking onto a highway could be responsible for a massive number of accidents.

Then again, no half-decent self-driving software should put itself in that position, since in almost all scenarios slowing down or applying the brakes is preferable to "evaluating survivability" and running into a wall, especially since those cars should have the sensors to detect pedestrians at a decent distance.
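To put that priority in concrete terms, here is a minimal sketch. The function name, the 8 m/s^2 deceleration figure, and the maneuver labels are invented for illustration, not drawn from any real vehicle's software:

```python
def plan_response(distance_to_obstacle_m: float, speed_mps: float,
                  max_brake_decel_mps2: float = 8.0) -> str:
    """Hypothetical logic: prefer braking over any 'evaluate survivability'
    swerve. Stopping distance from basic kinematics: d = v^2 / (2 * a)."""
    stopping_distance_m = speed_mps ** 2 / (2 * max_brake_decel_mps2)
    if stopping_distance_m <= distance_to_obstacle_m:
        return "brake_to_stop"        # the car can simply stop in time
    return "brake_and_stay_in_lane"   # shed speed; never trade the wall for the rider

# At 20 m/s (~72 km/h), stopping takes 400 / 16 = 25 m, so a pedestrian
# detected 40 m out never produces a dilemma in the first place.
print(plan_response(distance_to_obstacle_m=40.0, speed_mps=20.0))  # -> brake_to_stop
```

The point of the sketch is that with decent sensing range, the "wall or pedestrian" branch is almost never reached at all.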
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
This isn't a moral dilemma. Cars are tools. If it's my car it should serve me, not anybody else. I'm not gonna drive a car that can suddenly turn into a moral arbiter/executioner that gets to decide how important my life is compared to the lives of others.
 

hermes

New member
Mar 2, 2009
3,865
0
0
inu-kun said:
I've heard about this dilemma from a coworker and it makes absolutely no sense. The entire idea of a self-driving car is that it will behave as a responsible driver, meaning driving at a speed that allows it to stop in any conceivable case, so nothing short of people teleporting in front of the car while it's on a highway on a rainy day will actually necessitate this kind of dilemma. In real life, the worst case is a swerve that scratches the paint job.
Exactly. No half-decent self-driving car should be going so fast that it can't realistically maneuver around an obstacle that appears at the edge of its sensor range. To program it that way would be negligent on the part of the manufacturer.
 

Floppertje

New member
Nov 9, 2009
1,056
0
0
I agree with the general sentiment here; a car should not prioritize others over its driver (or owner, since the driver would be the car itself).
I think the more important point is that self-driving cars will not be as dangerous as people think. Yeah, at some point a computer (or rather, a programmer) might decide over life or death, but at least it will be able to make that decision (people don't really think that fast or clearly in that situation), and self-driving cars won't BE in that situation as often as human drivers are.
Robot cars don't have to be perfect. They just have to be better than human drivers, and that isn't really that hard to pull off...
 

Kenjitsuka

New member
Sep 10, 2009
3,051
0
0
"[Participants] were not as confident that autonomous vehicles would be programmed that way in reality-and for a good reason: "

Who is gonna MAKE cars that decide the driver (OWNER: = BUYER) is more/most disposable?
People would want everyone ELSE to drive those cars that safe persons outside of it, but no one is gonna *board* one themselves...
 

Souplex

Souplex Killsplosion Awesomegasm
Jul 29, 2008
10,312
0
0
If someone is injured by an autonomous car do they sue the programmer, the manufacturer, the owner or all of the above?
 

josemlopes

New member
Jun 9, 2008
3,950
0
0
Kill everyone who is breaking the law by crossing the road outside of a crosswalk.

The driver, and anyone outside the road, did nothing wrong; why should innocents pay?

This incident wouldn't happen if people followed the law; the ones who don't live in danger of killing themselves.
 

happyninja42

Elite Member
Legacy
May 13, 2010
8,577
2,985
118
Adam Jensen said:
This isn't a moral dilemma. Cars are tools. If it's my car it should serve me, not anybody else. I'm not gonna drive a car that can suddenly turn into a moral arbiter/executioner that gets to decide how important my life is compared to the lives of others.
I think the OP is referring to the moral dilemma of the people programming the protocols for the car.

OT: Yeah, I don't really see any automated driving system that would incorporate pedestrians in it. All of the prototype plans I've heard presented involve some kind of elevated or subterranean driving network for the vehicles, where the traffic is 100% controlled by the system. Which would drastically reduce the chance for mishaps.

I'm all for automated vehicles, as the number one cause of vehicular accidents is the driver doing something they shouldn't. I've yet to see an accident where everyone involved was following all of the traffic and safety guidelines. This fear that automated machines will somehow cause more accidents and be worse at driving than humans is laughable and counter-intuitive to me.