The Moral Dilemma of Self-Driving Cars - Save the Driver Or Others

RoguelikeMike

New member
Nov 10, 2014
5
0
0
By the way, I think it's absolutely ridiculous to expect a computerized car to be properly programmed for every possible situation. Every time you put a teenager behind the wheel, you're putting a malfunctioning, broken computer into a car that cannot be trusted to obey any laws or rules.

Basically, even the most chaotic and worthless program imaginable would still make better choices than human beings; the outcomes would still be better, and everyone would still benefit.

It's always the people who think they're special (i.e. everyone), cleverer, smarterer, betterer than everyone else, who have reason to fear computerized vehicles. "I could react better in that situation." "I have fast reflexes." "I would make better choices." "I would." "I can." "I am." Me. Me. ME.

Don't you people see that you are the problem? You're why these things are an absolute requirement. The roads flood every decade with more and more cars, each driver thinking they're a safe, fast, intelligent genius, but are they? No. They're driving death machines, usually distracted, bored, tired, upset, hungry, over-worked, sleep-deprived, etc. They're not operating at their best, however loudly they declare to the world that THEY alone are special, that nothing bad would happen to them, that they're GREAT drivers and a computer would be terrible. They are SO self-impressed and self-bloated that they actually feel fear at the idea of losing control of the vehicle to a computer. How sad.
 

umbr44

New member
Aug 27, 2014
7
0
0
Some of the responses in this thread scare me. Sure, I can see the apprehension about buying a self-driving car that won't always prioritise your life (I'll never get one, because I won't trust it to make the decision I would in an accident), but quite honestly, if a group of 10 schoolchildren don't bother to look and just walk out in front of my car, I'm not going to plow through them because they didn't follow the rules; I'm going to steer into a wall or something.

My life as a 23-year-old bloke (who doesn't hold the cure for AIDS or world peace in my head) doesn't come above 10 stupid schoolchildren who forgot to look before crossing the street as a group. No sane person would program a car that kills 10 children to save the driver... it's just ridiculous.

And that's when it's their fault. There are plenty of circumstances where people fall into the road; are they also fair game, no matter their age, because they had the audacity not to be driving a car?


In reality, most people swerve when something or someone comes out in front of them and there is no time to stop; human reflexes put the other person above ourselves in most circumstances, so anything that is programmed should too. Now, WHAT the precise response is, and when it takes control, is different for everyone, and that's why I would never trust a car to make the choice for me. I'll hit an oncoming lorry to avoid a pram that fell into the road; I won't do it to avoid an 80-year-old who just steps out.
 

MCerberus

New member
Jun 26, 2013
1,168
0
0
If you buy an objectivist car, it cuts people off at 90 mph and automatically sues all the cars on the road.
If you buy a relativist car, it drives on the wrong side of the road sometimes, when it feels like it.
If you buy a duty-ethics car, it can never change lanes because it's afraid of messing with other cars.
If you buy a determinist car, it drives off the nearest bridge, because it was programmed to.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
bladestorm91 said:
Here's the thing. If a self-driving car is following all the rules and it comes to the situation mentioned in that research, then the car and driver are not at fault; the ones crossing the road are.

That makes things simpler: the people crossing the road can go to hell. You and the car that followed the rules should be saved no matter what. This would be true even if that weren't the case, because no one would buy a car that would kill them.
And that's basically the answer. Provided that the car did not break any rules, there is no reason for it to sacrifice itself. If it did not break any rules, the only way this situation could occur is if others broke the rules, and in that case those others are at fault and the consequences should fall on them instead of on the one that did not break any rules.
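The fault-based rule being argued here can be sketched as a toy decision policy. This is purely illustrative: the function, its parameters, and the "brake"/"swerve" categories are invented for this sketch and are not from any real autonomous-driving system.

```python
def choose_action(vehicle_at_fault: bool,
                  can_stop_in_time: bool,
                  swerve_endangers_occupants: bool) -> str:
    """Toy fault-based policy: pick 'brake' or 'swerve'.

    The rule from the post: a car that broke no rules never
    chooses to sacrifice its occupants; it just brakes as hard
    as it can and leaves the consequences with whoever is at fault.
    """
    if can_stop_in_time:
        # No dilemma at all: stopping resolves the situation.
        return "brake"
    if not vehicle_at_fault:
        # Others broke the rules; the car protects its occupants.
        return "brake"
    if not swerve_endangers_occupants:
        # The car is at fault and swerving is safe for its occupants.
        return "swerve"
    # At fault, but no safe evasive option: default to maximum braking.
    return "brake"
```

For example, under this sketch a rule-following car facing jaywalkers would return "brake" even when it cannot stop in time, which is exactly the outcome the post defends.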

inu-kun said:
I've heard about this dilemma from a coworker and it makes absolutely no sense. The entire idea of a self-driving car is that it will behave as a responsible driver, meaning driving at a speed that will allow it to stop in any conceivable case, so nothing short of people teleporting in front of the car while it's on a highway on a rainy day will actually necessitate this kind of dilemma. In real life the worst case is a swerve that will scratch the paint job.
Clearly you have not met the crazy "I'm a pensioner, therefore I cross the road wherever I want" people who will just jump in front of you even if there is no crossing there. They don't even stop to look. Though I don't expect their life expectancy to be very long.



Souplex said:
If someone is injured by an autonomous car do they sue the programmer, the manufacturer, the owner or all of the above?
No one, because the only way they would be injured is by breaking the rules and being at fault themselves, assuming the car's programming is working as intended. If not, the manufacturer is going to be at fault, just like they would be if they sold you a car without brakes.

zidine100 said:
Which will probably end up being self-driven when this becomes a thing. Because hey, why not have a one-off cost in buying a vehicle that's self-sufficient and can work 24 hours a day, instead of having to deal with pesky employees who you have to pay, give time off, deal with them striking, etc., etc.

just saying.
Self-driving public transport will come sooner than self-driving cars anyway. We already have self-driving metros in some cities.

RatGouf said:
This sounds about as goofy as someone getting carsick because they're not driving the vehicle. Even weirder when the 3 or more passengers riding with the driver never get carsick.
Car sickness is a thing. While more common in kids, it's true for a small number of adults too. One of my friends cannot ride in a car as a passenger (she doesn't have a driver's license, obviously, so no idea about driving) if the trip is over 30 minutes; she will literally start vomiting. "Big" transport such as buses, trolleys, and trains she's more OK with. They tend to swerve less, I guess.

Though I agree that self-driving cars are not going to affect this at all.


RoguelikeMike said:
I believe that people have no business owning vehicles, and shouldn't have the right to drive them. Why do I think this? Well, first of all, because I have been a pedestrian for 28 years. I'll tell you right now that everyone, everyone, everyone without fail, everyone who sits behind the wheel of a car has no respect for anyone else. Everyone, even you. You, the one reading right now, thinking you do. You don't. How do I know this? Because I cross a double crosswalk every day. This crosswalk tells the turning lane it can go at the same time as it tells the pedestrians they can cross. I often stand there for 2-3 minutes waiting for the light to change; the people in the turning lane know I am there. They have 2-3 minutes to see me. BUT THE LIGHT IS GREEN! So they go. And I get to watch as my 15 seconds to cross tick by, and the turning lane keeps driving on as if I don't exist. I HAVE TO RUN INTO THE MIDDLE OF THE ROAD and STOP cars with my body in order to get to my job. EVERY DAY. WITHOUT FAIL.
Funny, I cross just such a crosswalk at least 4 times a day and have yet to see a situation like that, despite crossing it for the last 8 years. Maybe the problem is the driving culture in your city instead? Once I even saw the president of my country stop at that turn and let me (and other pedestrians) cross. Or, well, at least it was the president's car, as the tinted glass doesn't let me see inside.
 

maddawg IAJI

I prefer the term "Zomguard"
Feb 12, 2009
7,840
0
0
RoguelikeMike said:
By the way, I think it's absolutely ridiculous to expect a computerized car to be properly programmed for every possible situation. Every time you put a teenager behind the wheel, you're putting a malfunctioning, broken computer into a car that cannot be trusted to obey any laws or rules.

Basically, even the most chaotic and worthless program imaginable would still make better choices than human beings; the outcomes would still be better, and everyone would still benefit.

It's always the people who think they're special (i.e. everyone), cleverer, smarterer, betterer than everyone else, who have reason to fear computerized vehicles. "I could react better in that situation." "I have fast reflexes." "I would make better choices." "I would." "I can." "I am." Me. Me. ME.

Don't you people see that you are the problem? You're why these things are an absolute requirement. The roads flood every decade with more and more cars, each driver thinking they're a safe, fast, intelligent genius, but are they? No. They're driving death machines, usually distracted, bored, tired, upset, hungry, over-worked, sleep-deprived, etc. They're not operating at their best, however loudly they declare to the world that THEY alone are special, that nothing bad would happen to them, that they're GREAT drivers and a computer would be terrible. They are SO self-impressed and self-bloated that they actually feel fear at the idea of losing control of the vehicle to a computer. How sad.
Your argument, that human beings are inferior drivers compared to a computer, requires that traffic patterns be constant and unchanging, which we both know is incorrect. Every time I put a teenager behind the wheel, I may be putting a defective computer behind the wheel, but that defective computer can adjust to changes in road conditions. Would a smart car be able to tell when construction is going on and where the lanes have been shifted? Can a self-driving car recognize when a stop light is out of order and an officer is directing traffic instead? Can it recognize when a road is flooded, or when a street has been remade into a one-way? Can it recognize when a street is closed for construction or an accident? What about changes in speed limit for construction? Would it, in its effort to stay on the road, drive intentionally into potholes and possibly damage itself? How would it deal with parking laws and parking lots? How often would its GPS be updated?

These are just a few problems off the top of my head, and that doesn't even mention the fact that computerized robots are, at the moment, extremely clunky and slow, two factors that don't work on the road. It's not a matter of a car being expected to work in every scenario, but rather in the common ones that are ever-present in cities.

Yeah, I may think I'm a better driver than everyone else when, in reality, I may be sub-par, but then again, so are the conditions that I drive in.