Autonomous cars are already a reality, and it might not be long before they become commonplace on America’s roads. Their arrival, though, raises some serious ethical issues.
One of the much-lauded effects of having autonomous cars on the road is the predicted decrease in traffic-related deaths. Currently over 34,000 Americans die in car accidents every year, making them the largest accidental killer of Americans. This is despite the fact that cars have come a long way in terms of safety. Modern cars are equipped with multiple airbags and, increasingly, sensors that can slow you down or detect a vehicle in your blind spot. But no matter what, a person is still in control of the vehicle, and people make mistakes.
Autonomous cars remove the human element from driving, and by doing so should, in theory, reduce the number of traffic-related deaths caused by humans behind the wheel. But accidents will still happen, especially with robots and humans sharing the road, and lives will still be lost. So if you own a self-driving car, will you be immune from scrutiny when they do?
Classical ethical problems apply here too. Take the Non-Identity Problem: the identities of future accident victims would change with the introduction of autonomous cars. Under a consequentialist line of thought, if more lives are saved overall, the end result is a net positive and therefore an ethically good move for society.
Chaos theory, or the butterfly effect, is another good one; anything we do could start a chain reaction of other effects that result in actual harm or benefit to some person on the planet. Imagine a person who was going to become a truck driver, but all truck drivers have been replaced by robotic trucks. So the person seeks a different path in life, and perhaps accomplishes something either great or evil. Remember, Hitler was a scorned artist before he was a genocidal dictator, and Napoleon was only an artillery officer who wasn’t even born in France.
I think one of the best ethical problems to place the driverless car in is the Trolley Problem. In this version, the trolley is replaced by a school bus loaded with kids, and you are behind the wheel of a car on a narrow road, about to crash. It is a no-win situation: either you pull off the road and sacrifice yourself, or you hit the bus, killing the kids and yourself. If you are in an autonomous car, will the vehicle be able to make the decision to save dozens of young lives at the cost of yours? Probably not.
So, because of the decision made by the robot car, the kids are dead but you are saved. Who gets blamed in that unlikely scenario? Flip it around: if the car sacrifices you to save the kids, without your consent, is the automaker at fault? And if autonomous cars do the driving, are the people merely riding in them liable at all? It makes you think…
As a society we are quickly venturing into unknown waters with the mass use of autonomous cars. While automakers such as Volvo are pursuing driverless cars with safety in mind, there are bound to be major legal ramifications to the deployment of hands-off vehicles. I for one find the prospect of the autonomous car exciting and promising, but at the same time I’m fascinated by the ethical and potential legal issues that may come in its wake. Will autonomous cars save us from ourselves?
Andrew Meggison was born in the state of Maine and educated in Massachusetts. Andrew earned a Bachelor’s Degree in Government and International Relations from Clark University and a Master’s Degree in Political Science from Northeastern University. Being an Eagle Scout, Andrew has a passion for all things environmental. In his free time Andrew enjoys writing, exploring the great outdoors, a good film, and a creative cocktail. You can follow Andrew on Twitter @AndrewMeggison.