MIT Seeks 'Moral' to the Story of Self-Driving Cars


MIT's Moral Machine helps programmers understand moral choices self-driving cars may have to make.

Every time we get behind the wheel of a car, we put our lives and the lives of others at risk. Self-driving cars are designed to reduce those risks by letting technology control our vehicles.

Accident rates for self-driving cars have been much lower than the rates for human-driven cars. Google's self-driving car has had only 13 collisions after traveling 1.8 million miles.

As humans, we can make moral choices in avoiding accidents. To avoid hitting a child, for example, human drivers might sharply turn a car away from the child even if others may be injured.

But what moral choices can self-driving cars make?

Researchers at the Massachusetts Institute of Technology (MIT) have studied this issue. They developed the Moral Machine website to help explore the choices self-driving cars should make. MIT says the Moral Machine is "a platform for gathering a human perspective on moral decisions made by machine intelligence such as self-driving cars."

You can use the Moral Machine to be the judge and help self-driving cars decide what to do in different driving scenarios. Then, see how your choices compare to those others have made.

The choices users make on the Moral Machine can help MIT researchers examine the decisions self-driving cars should make. Users can also choose to take part without having their choices included in MIT research.

How the Moral Machine Works

A video on YouTube introduces the Moral Machine.

The Moral Machine website lets you choose how you would react in a collision.

You are shown two possible traffic situations and choose between them. In every situation, the brakes have failed on the self-driving car. The car will either continue in its current direction or turn. An accident will take place. Your choice decides which living beings would be hurt or killed, and who those victims would be.

In one situation there may be a female doctor, a child, two dogs and a homeless person who would be killed. In the other situation, you might have two babies and a cat who would be killed. You choose who lives and who dies.

The Moral Machine website has many situations and many possible outcomes.

MIT Moral Machine Scenarios

When you click on the situation of your choice, it will be highlighted. Then the next situation appears.

At the end of the session, you are shown the results, based on the choices you made. The results show which character you were most likely to save and which character you were most likely to let die. They also show how much each factor mattered to you, such as age, societal importance, or obedience to traffic laws.

MIT Moral Machine Results

After you have finished, you are offered a chance to use the Moral Machine again. Each time you are shown different situations.

You can also create your own situations.

You can also look at situations other users have created. You do not make choices in these situations, but each one has a discussion section where you can add your thoughts and read what others have posted.

I’m Caty Weaver.

Carolyn Nicander Mohr wrote this report for VOA Learning English. Caty Weaver was the editor.

Have you ever thought about the moral choices of self-driving cars? Have you tried Moral Machine? Were you surprised at your results? Share your thoughts in the Comments Section below or on our Facebook page.

______________________________________________________________

Words in This Story

platform - n. a program or set of programs that controls the way a computer works and runs other programs

perspective - n. a way of thinking about and understanding something (such as a particular issue or life in general)

scenario - n. a description of what could possibly happen

collision - n. a crash in which two or more things or people hit each other

outcome - n. something that happens as a result of an activity or process; result

database - n. a collection of pieces of information that is organized and used on a computer
