Who Will Driverless Cars Decide to Kill?
In philosophy, there’s an ethical thought experiment called the trolley problem. If pushing one large person in front of a moving trolley would save a group of people on the tracks, would you do it? This abstract dilemma has become concrete in the programming of self-driving cars: when a crash is unavoidable, whom should the car choose to hit?
Researchers from the Toulouse School of Economics set out to learn what the public would choose, posing a series of questions to online survey-takers, including a scenario in which a car either stays its course, killing 10 pedestrians and sparing the driver, or swerves, killing the driver to save the group.
They found that more than 75 percent of respondents supported self-sacrifice by the driver to save 10 people, and around 50 percent supported it to save just one person. Even so, respondents doubted that real cars would ever be programmed this way, expecting them instead to protect the driver at all costs.