
Moral Machine


Guest


Self-driving cars can evaluate unavoidable crash scenarios, make decisions, and react in the least harmful manner at lightning speed. That autonomous vehicles will be safer than the human-controlled variety is inevitable and uninteresting. What is quite fascinating, and possibly contentious, is the ethical decision-making that will have to be programmed into these machines.

Does the age, gender, species, number, legality, profession, or fitness level of the potential victims matter, and if so, which combinations trump others? This website has you evaluate several scenarios and then ranks your decisions against the average. Apparently, I favour the young, females, humans, the fit, and doctors, and have little regard for animals or for whether people are obeying traffic rules.

Check it out: http://moralmachine.mit.edu/

Do you differ significantly from the norm or my responses? Should these cars be programmed with universal ethics or have downloadable regional and cultural settings?

Edited by Guest

Interesting. I didn't kill any pets, but anyone else was fair game, albeit with a slight preference for protecting women and kids.

My first choice is that car makers be required to program the cars as dictated by government policy. Short of that, the program should prioritize the health and safety of the owner/driver over all other choices.


I always put human life over the lives of pets. I usually favored the pedestrians over the car riders. If you choose to drive a car and your brakes fail, what ethical right do you have to swerve and go out of your way to kill innocent pedestrians to save your own life?

It's not a matter of ethical rights but a matter of what people want. Would you get in a car that you knew was programmed to kill you if in doing so it saved others? I wouldn't. The split-second response of a driver faced with an unexpected imminent collision will be to do whatever they can to save themselves... usually that reaction will just be to hit the brakes, which is a good reaction because it reduces the collision velocity and thus the chance/severity of injuries. A self-driving car should make decisions much the way one would expect a human driver to, but with the benefit of faster response times and sensors that constantly monitor all directions with perfect attention, safety should be much improved.

A human faced with an imminent collision with a crowd of pedestrians in seconds isn't gonna try to identify the unfit ones or the old ones and aim for them while trying to avoid others, they are just gonna slam the brakes. If the brakes are broken as in the scenarios at the link, they likely won't have time to try anything else before the collision occurs. The self-driving car on the other hand could react more quickly. But the appropriate reactions would be to try the emergency brake, or try shifting the vehicle into a low gear, park, or reverse to generate as much braking as possible at the cost of mere damage to mechanical systems. The appropriate reaction by the algorithm would most certainly NOT be to try to do image recognition, identify targets of lower social value, and go full speed into them, and any company that sold a car programmed with such an algorithm would be faced with outrage and criminal charges.
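To make the graded fallback concrete, here is a minimal Python sketch of what I mean: try progressively more drastic ways to shed speed, accepting mechanical damage, and never do "target selection". The vehicle interface, method names, and speed reductions are entirely invented for illustration; no real vehicle control API is implied.

```python
class Vehicle:
    """Minimal stand-in for a vehicle control interface (illustrative only)."""
    def __init__(self, speed):
        self.speed = speed  # km/h

    def apply_parking_brake(self):
        self.speed = max(0, self.speed - 40)   # secondary/emergency brake

    def downshift_to_low_gear(self):
        self.speed = max(0, self.speed - 20)   # engine braking

    def engage_reverse_gear(self):
        self.speed = 0  # drivetrain damage accepted in exchange for a stop


def emergency_stop(vehicle):
    """Escalate through braking fallbacks, least to most damaging.

    Note there is no image recognition and no ranking of potential
    victims anywhere in this logic; it only tries to stop the car.
    """
    for action in (vehicle.apply_parking_brake,
                   vehicle.downshift_to_low_gear,
                   vehicle.engage_reverse_gear):
        action()
        if vehicle.speed == 0:
            return True   # stopped before exhausting fallbacks
    return False          # still moving; steer for the clearest escape path
```

The point of the ordering is that each step costs only money and machinery, never a choice between people.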

Frankly, all the pseudo-intellectual discussions on self-driving car ethics are silly. People want cars that will act the way they themselves would act but better, faster, and without their personal effort, not some programmer's ethics imposed on them.


A human faced with an imminent collision with a crowd of pedestrians in seconds isn't gonna try to identify the unfit ones or the old ones and aim for them while trying to avoid others, they are just gonna slam the brakes.

Yes, and that is probably why I decided to save all the pets. I couldn't ethically decide that the "large woman" was more deserving of life than the "female athlete", but dogs/cats are universally innocent (imo and all that), so I made my decision based on that whenever I could. After that came kids, based again on innocence, and then women, just because I am one.

If the scenarios were real, I'd try to steer between the two groups of pedestrians, thereby killing or maiming some of both groups. In the scenario where the choice is a group of pedestrians or a brick wall, I'd probably aim for the wall, figuring that my passengers and I have a better chance of surviving in the car than the pedestrians would as my car plowed into them.


It's not a matter of ethical rights but a matter of what people want. Would you get in a car that you knew was programmed to kill you if in doing so it saved others? I wouldn't. The split-second response of a driver faced with an unexpected imminent collision will be to do whatever they can to save themselves...

But not everyone would do that. There are people I know who wouldn't swerve into pedestrians to prevent themselves from hitting a lethal obstacle. And there are others who would.

Frankly, all the pseudo-intellectual discussions on self-driving car ethics are silly. People want cars that will act the way they themselves would act but better, faster, and without their personal effort, not some programmer's ethics imposed on them.

Well then maybe the answer is for everyone to complete this kind of ethics quiz when they buy their car, and then upload those choices to a network, so that every self-driving car will know every other self-driving car's behaviour in such circumstances, making everything more predictable and safer.
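As a rough sketch of that idea: each owner's quiz-derived settings get published to a shared registry that nearby vehicles can query. The registry structure, vehicle identifiers, and priority labels here are all invented for the sake of the example; nothing like this exists today.

```python
# Hypothetical shared registry mapping vehicle IDs to their owners'
# declared crash-response priorities (all names are made up).
registry = {}

def publish_settings(vehicle_id, priorities):
    """Upload an owner's choices after they complete the quiz."""
    registry[vehicle_id] = list(priorities)

def expected_behaviour(vehicle_id):
    """What a nearby car predicts this vehicle will do in an emergency.

    Vehicles with no published settings are assumed to take the
    default action: just brake.
    """
    return registry.get(vehicle_id, ["brake"])

# One owner publishes their settings; any other car can now query them.
publish_settings("car-123", ["protect_occupants", "brake"])
```

Whether anyone would actually want their settings broadcast like this is, of course, part of the debate.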


But not everyone would do that. There are people I know who wouldn't swerve into pedestrians to prevent themselves from hitting a lethal obstacle.

There are plenty of people who, given the moral dilemma of saving themselves vs. saving others as an intellectual exercise, would be altruistic and choose to save others. And some of them may even follow through in practice when presented with a scenario where they have to make such a choice.

But an imminent collision in a vehicle moving fast enough to be fatal doesn't give you time to consider ethical dilemmas. It doesn't give you time to choose to be selfish or altruistic. It doesn't even give you time to look around and see what all your possible options are. All you will see is "OMG THERE'S A WALL IN FRONT OF ME" and hit the brakes. If the brakes don't work, you might, if you are exceptionally skilled and have exceptionally fast reflexes, realize this just before impact and have time to try to swerve around the obstacle. By this time, you won't have time to check what else might be there. And you CERTAINLY will not have time to see what kind of people are there and evaluate which ones society will least miss if you happen to kill them.

The self-driving car should increase safety through its better-performing sensors and reflexes as compared to a human, not through engaging in ethical dilemmas. The TYPE of action the car takes should be analogous to what a reasonable human driver would do, but the result will be better because the action will be taken sooner and will be based on all available information rather than the subset of information that the human driver may have perceived. No driving course talks about how to choose which kinds of people you should crash into or discusses ethical dilemmas... instead they emphasize keeping attention on the road, scanning for hazards, driving defensively, being ready to hit the brakes when there are potential hazards or obstacles, etc. The self-driving car will do all these things, but better than a human can.


A human faced with an imminent collision with a crowd of pedestrians in seconds isn't gonna try to identify the unfit ones or the old ones and aim for them while trying to avoid others, they are just gonna slam the brakes. If the brakes are broken as in the scenarios at the link, they likely won't have time to try anything else before the collision occurs. The self-driving car on the other hand could react more quickly. But the appropriate reactions would be to try the emergency brake, or try shifting the vehicle into a low gear, park, or reverse to generate as much braking as possible at the cost of mere damage to mechanical systems. The appropriate reaction by the algorithm would most certainly NOT be to try to do image recognition, identify targets of lower social value, and go full speed into them, and any company that sold a car programmed with such an algorithm would be faced with outrage and criminal charges.

Frankly, all the pseudo-intellectual discussions on self-driving car ethics are silly. People want cars that will act the way they themselves would act but better, faster, and without their personal effort, not some programmer's ethics imposed on them.

I agree, but if the computer knows a collision is unavoidable, and it could collide with another car instead of the crowd of people, it might be reasonable to hit the other car.

But even just an overall directive to "avoid collisions" will make computer drivers way safer than people. The computer won't be eating a sandwich, smoking a cigarette, talking or texting on a cell phone, or putting on makeup. It won't be drunk either.


The self-driving car should increase safety through its better-performing sensors and reflexes as compared to a human, not through engaging in ethical dilemmas. The TYPE of action the car takes should be analogous to what a reasonable human driver would do, but the result will be better because the action will be taken sooner and will be based on all available information rather than the subset of information that the human driver may have perceived.

Well, the thing is the self-driving car is capable of making decisions a lot faster than a human. Yes, I understand there are lots of situations where humans don't have time to think, only react. Your stance is that self-driving cars should behave like humans in dangerous situations, but faster. OK, that's valid.

But then, like I said, self-driving cars can make faster decisions, and therefore there's an opportunity to potentially save more lives through defensive driving etc. Humans have some stupid reactions too that can be improved on. I dunno, I can see your side but I see other sides too. It's a very interesting topic; I want to see how it plays out in real life a bit first.


A sky train full of passengers is heading to the next station. It has a ticking time bomb on board and all the passengers will die if it doesn't get to the next station in time so they can offload. But on the way there, there are innocent people tied to the tracks, in the same number as the sky train passengers. If the train stops, these innocents will be spared but the passengers will die. What should the sky train do?

Do we really have to carefully consider this ethical dilemma and program the sky train accordingly? Or can we simply design a system that is safe enough under normal conditions? Society has already decided on the answer, and it applies just as much to the self driving car.

Edited by Bonam

"If you were up to your neck in sewage and someone started to pee on your head, would you duck?"

On a more serious note, if I am paying for a product then I would expect it to favor me and my family. Do you really want someone creating software for your car that would prioritize other people over your wife and children? Not if I am paying for it.

Edited by Big Guy

A sky train full of passengers is heading to the next station. It has a ticking time bomb on board and all the passengers will die if it doesn't get to the next station in time so they can offload. But on the way there, there are innocent people tied to the tracks, in the same number as the sky train passengers. If the train stops, these innocents will be spared but the passengers will die. What should the sky train do?

Do we really have to carefully consider this ethical dilemma and program the sky train accordingly? Or can we simply design a system that is safe enough under normal conditions? Society has already decided on the answer, and it applies just as much to the self driving car.

The skytrain wouldn't know about the bomb anyway, but if it did, it should stop immediately and offload people onto the tracks to be evacuated via service entrances. It shouldn't go to the next station, because there will likely be a shitload more potential victims there.

A better analogy is that the car knows there's going to be an accident because the road is blocked by a truck, a car, a biker, and a pedestrian, and there are ditches on both sides. It would be reasonable if the car decided to deploy the airbags and hit the ditch.

Edited by dre

The skytrain wouldn't know about the bomb anyway, but if it did, it should stop immediately and offload people onto the tracks to be evacuated via service entrances. It shouldn't go to the next station, because there will likely be a shitload more potential victims there.

Right, so what you are doing is coming up with reasonable ways out of the stupid, artificially concocted scenario. Just as one could come up with analogous ways out of the silly scenarios at the OP link. Why not just go onto an empty section of the sidewalk? Hit the emergency brake? Do a U-turn? Again, the point is that the course of action should never come down to an "ethical dilemma" for a computer to solve, ever. Just do the obvious things that make sense.


I favour the young... females

a slight preference for protecting women

It's strongly ingrained in society that the lives of women matter more than the lives of men. This result doesn't surprise me, but is indicative of a much bigger set of problems.


It's strongly ingrained in society that the lives of women matter more than the lives of men. This result doesn't surprise me, but is indicative of a much bigger set of problems.

Yeah, I read some of the men's rights stuff too. Men do still have the majority of the power within our society, but it's also true that men in lower echelons have some obstacles to overcome on the way to true gender equality. Women did have to create their own movement to highlight the inequity; unfortunately, men have not had as much success in doing that for themselves. I'm not sure what the reasons are; perhaps women are better at building community? Or maybe enough men retain enough power that those disenfranchised men are ignored? Anyway, I was once looking for a men's rights group on behalf of my son and could find nothing local. I did find a group in Toronto, but they ignored my phone and email messages.


Women did have to create their own movement to highlight the inequity; unfortunately, men have not had as much success in doing that for themselves.

I think one of the main reasons is that women complaining about things is consistent with gender roles. Men complaining about things isn't. Men are expected to man up and quit complaining or whatever.

Or maybe enough men retain enough power that those disenfranchised men are ignored?

This too.

Edited by -1=e^ipi

I'd take the pigs over Pickton.

Sure, it's easy to say I'd take (or save) an animal over Pickton or a matador... or Charles Manson, Casey Anthony, or Al Sharpton for that matter. But I doubt you'd be able to set your car to make decisions based on a person's name, profession, or political belief system.

If you could set your car to swerve around the ducklings and hit the protesting liberals, fair enough, but what happens if there is someone in that crowd who is not actually a liberal?

No, I think the car would have to be programmed that all people are more valuable than animals.

