The Moral Dilemmas of Self-Driving Cars

Image: a self-driving car. Photo by Michael Shick (own work), CC BY-SA 4.0, via Wikimedia Commons.

As automotive companies work toward getting autonomous vehicles (AVs) on the road, there are ethical questions the public should be asking. According to Iyad Rahwan, an MIT cognitive scientist, “Every time the car makes a complex maneuver, it is implicitly making a trade-off in terms of risks to different parties.”

An important question in AV ethics is whose lives should be sacrificed in the event of an unavoidable crash. For example, if a pedestrian falls in front of an AV, the AV has the option of swerving into a traffic barrier, which could kill the passenger, or continuing straight, which could kill the pedestrian. How does the AV decide what to do?

Another ethical question is how AVs should behave when passing a cyclist or a pedestrian. As human drivers, we typically give cyclists and runners extra space when we pass them because it feels safer. Humans make these decisions based on intuition, which is not something that can be programmed into artificial intelligence. Instead, programmers must create and define precise rules for each situation or rely on general driving rules.
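To see what turning that intuition into an explicit rule might involve, here is a minimal, purely hypothetical sketch. The road-user categories, clearance distances, and speed threshold are all invented for illustration; real systems would derive such values from regulation and testing.

```python
def required_clearance_m(road_user: str, speed_kmh: float) -> float:
    """Return the minimum lateral gap (in meters) to leave when passing.

    A human driver leaves extra room "by feel"; an AV needs that habit
    spelled out as a rule. All numbers here are illustrative assumptions.
    """
    base = {"vehicle": 0.5, "pedestrian": 1.0, "cyclist": 1.5}
    gap = base.get(road_user, 1.5)  # unknown road users get the most cautious gap
    if speed_kmh > 50:              # faster passes get extra margin
        gap += 0.5
    return gap
```

Even this toy rule shows the difficulty: someone had to decide, in advance and in numbers, exactly how much extra space a cyclist deserves.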

Rahwan says, “On one hand, the algorithms that control the car may have an explicit set of rules to make moral trade-offs. On the other hand, the decision made by a car in the case of unavoidable harm may emerge from the interaction of various software components, none of which has explicit programming to handle moral trade-offs.”
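Rahwan's first case, an explicit rule set for moral trade-offs, could in principle look something like the sketch below. This is not how any real AV decides; the maneuvers, harm probabilities, and the simple "minimize total expected harm" rule are assumptions made up for illustration.

```python
def least_harm(maneuvers):
    """Pick the maneuver with the lowest summed expected harm.

    `maneuvers` maps a maneuver name to a dict of
    {affected party: estimated probability of serious harm}.
    Summing probabilities equally across parties is itself a moral
    choice -- exactly the kind of trade-off Rahwan describes.
    """
    return min(maneuvers, key=lambda m: sum(maneuvers[m].values()))

# Hypothetical unavoidable-crash scenario with invented numbers:
options = {
    "brake_straight": {"pedestrian": 0.6, "passenger": 0.1},
    "swerve_to_barrier": {"pedestrian": 0.0, "passenger": 0.4},
}
```

In Rahwan's second case, no such function exists anywhere in the code; the outcome simply emerges from how perception, planning, and control components interact.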

In many cases, AV companies are trying to avoid questions about how they handle these types of ethical issues. Consumer Watchdog’s Wayne Simpson, an AV skeptic, points out, “The public has a right to know when a robot car is barreling down the street whether it’s prioritizing the life of the passenger, the driver, or the pedestrian, and what factors it takes into consideration. If these questions are not answered in full light of day … corporations will program these cars to limit their own liability, not to conform with social mores, ethical customs, or the rule of law.” Before self-driving cars become a mainstream form of transportation, society needs to address these ethical concerns, which could otherwise lead to considerable controversy and lawsuits.