
Ethics And The Self-Driving Car

Why does it have to look like a car?

In last night’s critical thinking class, we discussed and debated how to instill ethics into self-driving cars. To kick off the debate, I adapted a question from an article by Will Knight in a recent issue of Technology Review.

If a child runs in front of a self-driving car, is it ethical for the car to injure or kill its own passengers in order to avoid the child?

The question split the class roughly in two. Some students argued that the car should avoid the child at all costs – even at the risk of serious injury or death to the car’s passengers. Others argued that the life of the person in the car was at least equal to that of the child. And if the car contains more than one passenger, wouldn’t multiple lives outweigh the value of one child’s life? Why should the child alone be protected?
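
To make the “count the lives” argument concrete, here is a minimal sketch of what a purely utilitarian rule might look like. The function name and its inputs are invented for illustration only; no real self-driving car exposes a decision rule this simple.

```python
# A hypothetical, purely utilitarian rule: protect whichever group is larger.
# This illustrates the students' argument, not any real vehicle's software.

def choose_action(passengers: int, pedestrians: int) -> str:
    """Swerve only if staying the course would risk more lives."""
    if pedestrians > passengers:
        return "swerve"       # fewer lives inside the car: protect the pedestrians
    return "stay the course"  # equal or more lives inside: protect the passengers

# One child in the road, two passengers in the car:
print(choose_action(passengers=2, pedestrians=1))  # -> "stay the course"
```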

The students also weighed the value of different lives. I asked them to assume that I was in the car. So the question became: Which is more valuable, the life of a child or the life of a mature adult? Some argued that the child was more valuable because she has many years of life yet to live. By comparison, I have lived many years and “used up” some of my value.

Others argued exactly the opposite. A mature adult is a storehouse of knowledge and wisdom. To lose that would be a blow not only to the individual but also to the community. The death of a child, though tragic, would eliminate only a small quantity of wisdom.

The question then evolved into another: Could we have different ethical systems in cars used in different cultures? For instance, could cars in cultures that value older people as fonts of wisdom use an ethical system that protects adults over children? At the same time, could cars in youth-oriented cultures opt to protect children ahead of adults? What are the ethics here?

As you may have noticed, we were all making one key assumption: that self-driving cars will be delivered with an ethical system already installed. Let’s change that assumption. Let’s say the car arrives in your driveway (it delivers itself, presumably) with no ethical rules programmed into it. The first thing we do is program in the rules that we believe are the most ethical. The rules I program into my car may be different from the rules you put in your car.

This is, of course, exactly what we do today. When I drive my car, it’s controlled by my ethical system. Your ethical system controls your car. Our ethics may be different, so our cars “behave” differently. We’re quite used to that in everyday life, partially because the rules are not explicit. But as we explicitly program ethics into cars, is that really the way we want to do it?
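
If owners really did install their own rules, the “program” might amount to little more than a priority ordering. Here is a minimal sketch of that idea; the EthicsPolicy class and the group names are hypothetical, invented only to show how two cars could carry different rules.

```python
# A hypothetical owner-installed ethics policy: an ordered list of groups,
# highest priority first. All names here are invented for illustration.

from dataclasses import dataclass

@dataclass
class EthicsPolicy:
    priorities: list[str]  # groups the car protects, in order of preference

    def protect_first(self, groups_at_risk: list[str]) -> str:
        """Return the highest-priority group among those at risk."""
        for group in self.priorities:
            if group in groups_at_risk:
                return group
        return groups_at_risk[0]  # no listed priority applies; default to the first

# My car and your car "behave" differently because our rules differ.
my_car = EthicsPolicy(priorities=["children", "pedestrians", "passengers"])
your_car = EthicsPolicy(priorities=["passengers", "children", "pedestrians"])

at_risk = ["children", "passengers"]
print(my_car.protect_first(at_risk))    # -> "children"
print(your_car.protect_first(at_risk))  # -> "passengers"
```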

Pushing on, we also addressed questions about self-driving cars with no passengers. We generally assume that self-driving cars will have passengers and that at least one of them will be a responsible adult. But do they have to? Here are two variants:

  • I want a pizza. Is it ethical for me to send my self-driving car to the local pizza place to pick it up for me? Conversely, could the pizza place simply deliver it to me in a self-driving car with no human in it?
  • Is it ethical for a dad to place his two kids in a self-driving car and instruct the car to deliver them to their primary school … without Dad going along? Could self-driving cars become child delivery systems?

I think we can find many more questions here. To wit: Can you own a self-driving car, or is it a community resource to be shared? Does a self-driving car have to look like a regular old car? Why? (Here’s an example of one that doesn’t.)

There’s a lot to think about here. Indeed, we may find it more difficult to solve the ethical problems than the technical ones. I’ll write more about these issues in the future. In the meantime: Safe driving!
