With the introduction of self-driving vehicles, the developers behind their decision-making AI have had to rethink some age-old ethical dilemmas, specifically whom the car should choose to save in the event of a crash. An article published in Nature: International Journal of Science details the results of the Moral Machine experiment, which confronted over two million participants with a variety of hypothetical moral dilemmas faced by an autonomous vehicle, its passengers and nearby pedestrians. For instance, participants were presented with the graphic shown below and asked which of the two choices would be preferable in the event of brake failure: the death of three elderly pedestrians illegally crossing the road, or the death of the young family in the car.

Through the recording of almost 40 million decisions via this experiment, the researchers focused on nine distinct factors:

- sparing humans versus pets
- staying on course versus swerving
- sparing passengers versus pedestrians
- sparing more lives versus fewer
- sparing men versus women
- sparing the young versus the elderly
- sparing legal pedestrians versus jaywalkers
- sparing the fit versus the less fit
- sparing those with higher social status versus lower

Across all responses, no matter which country or demographic they came from, the strongest preferences were to spare human lives rather than pets, to spare more lives rather than fewer, and to spare younger lives rather than the elderly (in that order). While this may seem obvious, the decision to implement these preferences into autonomous driving software isn't as straightforward. The ability to detect… [Read full story]
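As a rough illustration of how millions of binary choices could be tallied into per-factor preference strengths, here is a minimal sketch in Python. The factor names and sample data are invented for illustration; the actual study used a more sophisticated conjoint-analysis approach, not this simple tally.

```python
from collections import Counter

# Hypothetical sample of recorded decisions. Each entry names the factor
# being tested and whether the respondent chose to spare the first-named
# group (e.g. humans over pets). Illustrative data only -- not the real
# Moral Machine dataset.
decisions = [
    ("humans_vs_pets", True), ("humans_vs_pets", True), ("humans_vs_pets", False),
    ("more_vs_fewer", True), ("more_vs_fewer", True),
    ("young_vs_elderly", True), ("young_vs_elderly", False),
]

def preference_strength(decisions):
    """Return, per factor, the fraction of respondents who spared
    the first-named group."""
    totals, spared = Counter(), Counter()
    for factor, chose_first in decisions:
        totals[factor] += 1
        if chose_first:
            spared[factor] += 1
    return {factor: spared[factor] / totals[factor] for factor in totals}

# Rank factors from strongest to weakest preference.
ranking = sorted(preference_strength(decisions).items(),
                 key=lambda kv: kv[1], reverse=True)
```

A real analysis would also need to control for how scenarios were generated and for respondent demographics, which is where the simple tally above falls short of the study's methodology.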
Self-driving cars should prioritise young lives, according to new global study
TechRadar is an online publication focused on technology, with editorial teams in the US, UK, Australia and India. It provides news and reviews of tech products and first launched in 2008.