Driverless cars pose moral “dilemmas”
In the event of an unavoidable crash, should a driverless car be programmed to save as many lives as possible? Or should it instead protect its occupants rather than others? That type of “social dilemma” is being pondered by social scientists, government regulators and manufacturers as the commercial debut of autonomous cars draws closer in California and nationwide.
In a study published in June, researchers surveyed 1,928 consumers on how they would want driverless cars to respond in various crash scenarios. For example, participants were asked what an autonomous car should do if pedestrians dashed in front of it and the only way to avoid hitting them was to swerve into a wall, endangering the car’s passengers. Slightly more than 75 percent of the respondents believed driverless cars should be programmed to be “utilitarian” and try to save the most lives, possibly at the expense of their own passengers. However, 81 percent said they would rather purchase a car that would protect them and their families in all cases.
A co-author of the study noted that, while the public may intellectually recognize the moral superiority of the utilitarian model, most consumers will still want to purchase a car programmed to protect them in all situations. Car manufacturers may therefore have an incentive to sell vehicles that always place passenger lives first, but government regulators could take a different view of the matter. The National Highway Traffic Safety Administration has said it plans to issue updated guidelines on self-driving cars and the underlying technology.
While the auto-pedestrian accident posed in the survey was hypothetical, thousands of people are unfortunately injured in real ones each year. A pedestrian who had the right of way but was seriously injured after being hit by a car may want the assistance of counsel in seeking compensation from the at-fault motorist.
Source: CNN, “Driverless cars create a safety ‘dilemma’: passengers vs. pedestrians,” Jacqueline Howard, June 23, 2016