The Moral Dilemma of Self-Driving Cars - Save the Driver Or Others

lacktheknack

Je suis joined jewels.
Jan 19, 2009
19,316
0
0
Space Jawa said:
I'll go one further and say I have no intention of ever buying a self-driving car at all.
Nailed it.

If self-driven cars become standard, I'm swapping permanently to buses and city trains, the end.
 

9tailedflame

New member
Oct 8, 2015
218
0
0
I don't know about anyone else, but I'm not driving a car that drives itself and doesn't prioritize my life.

I also kinda don't like the idea overall; it just feels a bit too "big daddy government's got his eye on you" for my taste. There are all sorts of things that could go wrong.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
That would be an interesting selling point of a car.

"Well sure, the other guy has a better gas efficiency rating but our car won't murder you if some asshole jumps out in traffic"
 

zidine100

New member
Mar 19, 2009
1,016
0
0
lacktheknack said:
Space Jawa said:
I'll go one further and say I have no intention of ever buying a self-driving car at all.
Nailed it.

If self-driven cars become standard, I'm swapping permanently to buses and city trains, the end.
Which will probably end up being self-driven when this becomes a thing. Because hey, why not have a one-off cost in buying a vehicle that's self-sufficient and can work 24 hours a day, instead of having to deal with pesky employees who you have to pay, give time off, and deal with when they strike, etc.?

just saying.
 

Qizx

Executor
Feb 21, 2011
458
0
0
inu-kun said:
I've heard about this dilemma from a coworker and it makes absolutely no sense. The entire idea of a self-driving car is that it will behave as a responsible driver, meaning driving at a speed that will allow it to stop in any conceivable case, so nothing short of people teleporting in front of the car while it's on a highway on a rainy day will actually necessitate this kind of dilemma. In real life, the worst case is a swerve that will scratch the paint job.
Agreed.
I think it's important to note that if someone is somehow in front of my car without me having time to stop, it's damn well their fault. If I'm on I-95 doing 70 and somehow a person is in front of me, it's because they somehow decided to walk across one of the busiest highways. I am NOT dying for them.
 

Aeshi

New member
Dec 22, 2009
2,640
0
0
Yeah, you can add me to the "I'm not buying a vehicle that treats me as disposable" crowd.

And I'm willing to bet money that the people responsible for writing that particular bit of code won't either. Or that they'll try to "jailbreak" said cars if they do.
 

RandV80

New member
Oct 1, 2009
1,507
0
0
lacktheknack said:
Space Jawa said:
I'll go one further and say I have no intention of ever buying a self-driving car at all.
Nailed it.

If self-driven cars become standard, I'm swapping permanently to buses and city trains, the end.
No offense, but I doubt people with your sentiment will stick to it very long. Driverless or not, riding the bus sucks. They're constantly stopping every block or two to let people on or off, and you have to work on their schedule, where catching the bus or making a transfer could mean waiting anywhere from 5-60 minutes depending on the location/city. So good luck sticking to a 90-minute transit route when a self-driving car could get you there in 30. Oh, and not to mention being packed in there like sardines if it's a busy route during rush hour.

Now, if you have good rapid transit options close enough to you, such as trains, that's different. You'll still be packed in like sardines during rush hour, but these are usually faster than regular traffic.
 

Pyrian

Hat Man
Legacy
Jul 8, 2011
1,399
8
13
San Diego, CA
Country
US
Gender
Male
MTGorilla said:
In poor visibility conditions, how would a human fare any better?
Humans are terrible drivers, and I'm firmly in the pro-self-driving car camp because they will almost certainly net save tens of thousands of lives per year. Regardless, we know this about humans: we're overwhelmingly likely to think of our own safety first.

MTGorilla said:
The autonomous car has the advantage of infrared sensing equipment, which can easily tell the difference between a real living thing and a piece of cardboard blowing in the wind.
But not between a real living thing and a puff of exhaust; nor between a "real living thing" and a human being.

I don't think I would ever program in the "run into wall" case. Avoid obstacles, stay on the road. The obstacles themselves would be evaluated mostly on the ability to avoid them for as long as possible, and otherwise on minimizing the final collision speed, and maybe size. But trying to calculate all the possible and likely consequences? Basically impossible ("let's kill this guy, he's going to grow up to be the next Stalin") (Hitler is overused).
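The heuristic described above (prefer paths that avoid collision entirely; among unavoidable collisions, delay the impact and minimize the final collision speed) could be sketched roughly like this. This is a minimal illustration, not anyone's actual implementation; all names, weights, and units are invented:

```python
from dataclasses import dataclass

@dataclass
class Trajectory:
    collides: bool          # does this candidate path end in a collision?
    time_to_impact: float   # seconds until impact, if any
    impact_speed: float     # vehicle speed at impact in m/s (0 if no collision)

def score(t: Trajectory) -> float:
    """Lower is better. Collision-free paths always beat colliding ones."""
    if not t.collides:
        return 0.0
    # Delay impact as long as possible and minimize final collision speed.
    # The 0.5 weight is arbitrary, purely for illustration.
    return t.impact_speed - 0.5 * t.time_to_impact

def choose(trajectories: list[Trajectory]) -> Trajectory:
    return min(trajectories, key=score)

options = [
    Trajectory(collides=True,  time_to_impact=1.0, impact_speed=15.0),
    Trajectory(collides=True,  time_to_impact=2.0, impact_speed=5.0),
    Trajectory(collides=False, time_to_impact=0.0, impact_speed=0.0),
]
best = choose(options)
print(best.collides)  # False: the collision-free option always wins
```

Note that nothing here tries to weigh lives or predict downstream consequences; it only ranks physically measurable outcomes, which is the point being made.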

MTGorilla said:
...it's one thing to have the vehicle want to keep the occupants safe, and another to have a crazed Terminator-mobile that causally mows down pedestrians.
Eh, any collision is a danger, so it's not like it's going to go after pedestrians. ...Unless hacked...
 

Jacked Assassin

Nothing On TV
Jun 4, 2010
732
0
0
I don't get this moral dilemma at all. Are they building self-driving cars without brakes? Or are they honestly expecting that a driver would also swerve or go head-on without applying the brakes?

This sounds about as goofy as someone getting car sick because they're not driving the vehicle. Even weirder when those 3 or more passengers with the driver never get car sick.
 

Tiamat666

Level 80 Legendary Postlord
Dec 4, 2007
1,012
0
0
Yes, it's an interesting problem, but I don't think we should obsess about it.

First of all, extreme situations like these are unlikely and will be extremely rare. Second, a decision has to be made; if not by the AI, then by the more fallible human intelligence. And there is such a thing as the worst decision, such as killing the maximum number of pedestrians and the driver. Yes, an AI might weigh the value of human lives against each other, but with good programming the decision will always be "a good one", causing the least possible amount of harm. With a human driver, the chances of making a bad decision, including the ones that hurt the driver himself, will be much higher.

Fact is, a sufficiently advanced AI will always make better decisions than a human -especially- in stressful or time constrained situations.
 

fix-the-spade

New member
Feb 25, 2008
8,639
0
0
I think semi-autonomous cars, like aircraft, will become the norm rather than fully driverless ones. No buying customer is going to sit in a car that will try to kill them because their odds of survival in the crash are better than the other guy's; that's a liability nightmare anyway. No company is going to sell a car in Europe that can be proven to have deliberately crashed itself; they'd be liable for all the damage.

No, autopilot on a switch, deactivated when you make an input. Stuck in traffic? Leave it on auto. Motorway cruise? Leave it on auto. Something unexpected happens, grab the wheel or hit a pedal and you have instant control.

Tiamat666 said:
Fact is, a sufficiently advanced AI will always make better decisions than a human -especially- in stressful or time constrained situations.
How advanced is 'sufficiently advanced'? Are we talking technology available now, or Lt. Cmdr. Data? Current AI lacks the ability to improvise, which is unfortunate, because humans are great at creating absurd and/or unlikely scenarios.

Kross said:
There's also no way I'd ever want to use an automated vehicle unless every other vehicle on the road is also automated, as that swarm coordination is where the true benefit resides.
In complete seriousness, after maintaining servers or any kind of mass storage/data transfer infrastructure, why would anyone in their right mind trust that many moving objects entirely to automation?

If they're all automated, they all have to communicate with each other, their operator, and some kind of central control. Under that kind of load even a nearly perfect system would fall on its face constantly, and I've yet to see human beings build a perfect system. I suppose it would keep you in a job.
 

Stupidity

New member
Sep 21, 2013
146
0
0
1.2 million people die every year in automobile accidents. 1.2 MILLION.
This moral dilemma represents the tiniest fraction of lives that would be saved by driverless cars.

If it takes giving driverless cars a license to kill to get them on the road, then we should do it.
 

Kross

World Breaker
Sep 27, 2004
854
0
0
fix-the-spade said:
Kross said:
There's also no way I'd ever want to use an automated vehicle unless every other vehicle on the road is also automated, as that swarm coordination is where the true benefit resides.
In complete seriousness, after maintaining servers or any kind of mass storage/data transfer infrastructure, why would anyone in their right mind trust that many moving objects entirely to automation?

If they're all automated, they all have to communicate with each other, their operator, and some kind of central control. Under that kind of load even a nearly perfect system would fall on its face constantly, and I've yet to see human beings build a perfect system. I suppose it would keep you in a job.
Everything in the same system is incredibly easier to maintain and safely fail out of than when there are uncontrolled and unknown variables. Systems start to break when human randomness gets involved, and the redundancy of multiple systems mutually distrusting each other while still working together will help detect problems much further in advance than otherwise.

Vehicles can also travel much faster when they know where every other vehicle is and how far they have to safely stop if an object appears in their path. I'd imagine part of such an effort would involve sensors adjacent to the road also reporting nearby objects to the vehicles.
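The "how far they have to safely stop" figure above is simple kinematics: stopping distance is the distance covered during reaction (or communication) latency plus v²/(2a) of braking distance. A rough back-of-envelope sketch, where the 8 m/s² deceleration and the latency figures are assumptions, not specifications:

```python
def stopping_distance(speed_ms: float, decel_ms2: float = 8.0,
                      latency_s: float = 0.1) -> float:
    """Distance (m) to stop from speed_ms: latency_s of travel with no
    braking, then constant deceleration at decel_ms2."""
    reaction = speed_ms * latency_s
    braking = speed_ms ** 2 / (2 * decel_ms2)
    return reaction + braking

# From roughly 70 mph (~31 m/s): a networked car reacting in 0.1 s
# vs a human taking about 1.5 s to perceive and respond.
print(round(stopping_distance(31.0, latency_s=0.1), 1))  # 63.2
print(round(stopping_distance(31.0, latency_s=1.5), 1))  # 106.6
```

The braking distance is identical in both cases; shaving the reaction latency is where coordinated, sensor-fed vehicles buy their extra margin (and thus higher safe speeds).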

You do still have to handle exceptions (never rely on anything to always work) like a person or animal wandering into the road, but in that case I'd be in favor of whatever results in the least damage to the vehicle and passengers taking action.

As far as keeping server administrators in jobs, pretty much every industry everywhere needs someone to configure their computers, and this kind of work sounds stressful on the level of Healthcare-related computer systems - which I'll pass on dealing with for the foreseeable future. ;)
 

Flames66

New member
Aug 22, 2009
2,311
0
0
I don't know about this situation, but it seems so unlikely that I wouldn't worry too much about it. Statistically it will happen to someone, but I don't want to waste my time trying to figure out a solution.

I quite like the idea of a self driving car, but I will never use one that has to be connected to a network to function. It must rely exclusively on software and hardware in the vehicle with no external input.

The same problem exists with phones. Knob: "My phone can translate between any language and read out the result for all to hear." Me: "Can it do it while offline?" Knob: "No." Me: "Then it can't do it."

josemlopes said:
Kill everyone that is breaking the law by crossing the road outside of a crosswalk.
Not illegal around here.
 

Neurotic Void Melody

Bound to escape
Legacy
Jul 15, 2013
4,953
6
13
Good luck trying to sell the automated car that is willing to sacrifice its driver; you're going to need to bury that clause deep in the T&C first. It is a circular debate with no right answer, but if the car must sacrifice me to further the lives of some undeveloped snotbags who thought it'd be hilarious to play chicken, it had better record a decent farewell message and make a damn fine compensatory ice cream for my family's grief/celebration. Maybe high-end models could record a personalised mix-tape exploring the peaks and troughs of coming to terms with our own mortality. They could even scent your corpse for a more pleasant extraction! It may look messy in there, miss, but if we all just take a deep breath in, we can enjoy the honey and lavender of the situation.
 

Meximagician

Elite Member
Apr 5, 2014
608
127
48
Country
United States
Can't help but think, not of how these cars should be programmed, but of how they will be. Place higher value on any customers in the scenario (including those not in the car); devalue any customers of competitors (especially if they're also in a driverless car); avoid PR disasters by striking those with less staying power in the mass media. My inner ethicist is terrified; my inner businessman is laughing maniacally.

"Just imagine a Konami self driving car!" he keeps shouting, "Or a Volkswagen one! Or General Motors!"
 

Kaymish

The Morally Bankrupt Weasel
Sep 10, 2008
1,256
0
0
bladestorm91 said:
Here's the thing. If a self-driving car is following all the rules and it comes to the situation mentioned in that research, then the car and driver are not at fault, but the ones who are crossing the road.

That makes things simpler, the people crossing the road can go to hell. You and the car who followed the rules should be saved no matter what. This would be true even if that wasn't the case because no one would buy a car that would kill you.
I agree, it's the people in the wrong who should pay the price, and a self-driving car is programmed to drive like such a nana that it will almost never be in the wrong.
And that being the case, go ahead and plow on through the fools standing in the middle of the road; they chose death. The passenger in the car and the pedestrian on the footpath did not, and the motorcyclist was probably doing something stupid anyway, so run right over him too. Serves him right :p