Although it can feel like our cars are thinking on their own, they aren't quite there yet. But they will be, and you probably already know that fully self-driving cars are being tested. Some cars, like Teslas, already have self-driving features, but even the manufacturer warns that its cars are not ready to drive themselves everywhere without human intervention.
But the day is coming when cars will be able to fully drive themselves. And when they do, those cars will be making decisions about the safety of both the people outside the car and the people inside it.
The Moral and Ethical Dilemmas
The typical moral dilemma is this: faced with the possibility of hitting a person, perhaps a child, in the street, does the car hit that person, or does it swerve to avoid them, potentially going off the road and seriously injuring the people inside the vehicle?
While more advanced cars will have technology designed to keep such situations rare, that won't eliminate them entirely. It means that somewhere along the line, a vehicle's programmer has to tell the vehicle what to do when one arises.
The ethical questions can get pretty complex. For example, what if the car encounters an animal? Does it swerve for any animal? The car would have to evaluate how large the animal is; it may be safer to swerve off the road than to hit, say, a large moose. So the car might be programmed to swerve only for larger animals and to stay in its lane for smaller ones. Does the species matter? Should it swerve for a dog but not for a raccoon?
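To make that concrete, here is a deliberately simplified sketch, in Python, of the kind of rule a programmer might have to write. Everything in it is invented for illustration: the Obstacle type, the mass threshold, and the "always swerve for humans" rule. A real autonomous-driving system would weigh far more factors than this.

```python
# Hypothetical sketch only: no real autonomous-driving stack works this simply.
from dataclasses import dataclass

@dataclass
class Obstacle:
    kind: str          # e.g. "human", "moose", "dog", "raccoon"
    mass_kg: float     # rough mass estimate from the perception system

# Assumed threshold: above this mass, hitting the obstacle is taken to
# endanger the occupants more than leaving the road would.
SWERVE_MASS_THRESHOLD_KG = 100.0

def should_swerve(obstacle: Obstacle) -> bool:
    """Return True if the car should leave its lane to avoid the obstacle."""
    if obstacle.kind == "human":
        return True  # always try to avoid people
    # For animals, swerve only when the impact itself would be dangerous
    # (a moose, say); brake but stay in lane for small animals.
    return obstacle.mass_kg >= SWERVE_MASS_THRESHOLD_KG

print(should_swerve(Obstacle("moose", 400.0)))   # True
print(should_swerve(Obstacle("raccoon", 8.0)))   # False
```

Even in this toy version, notice how many value judgments are baked into a few lines: where the threshold sits, and whether species matters at all. That is exactly the difficulty.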
These questions are no longer merely hypothetical, and they illustrate the difficulty of programming self-driving vehicles.
We Don’t Want to Give Up Control
And while many of us wish the days of self-driving cars were already here, when asked, most people say they want control in these situations: they don't want these decisions made by a computer (even though it can be argued that the car's computer could react faster than human reflexes and so improve the chance of a good outcome).
In other words, most people would want the car to do whatever harms the fewest people, but they don't want to leave that decision up to the car itself.
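Stated as code, that "fewest people harmed" rule is almost trivially short. The sketch below is again purely hypothetical: the candidate maneuvers and the injury estimates are made-up inputs, since producing those estimates is the genuinely hard part.

```python
# Invented sketch: "do whatever harms the fewest people" reduces to
# picking the action with the lowest expected number of injuries.

def least_harm(actions: dict[str, float]) -> str:
    """Pick the action with the lowest expected number of people harmed."""
    return min(actions, key=actions.get)

# Hypothetical injury estimates a planning module might report:
options = {
    "brake_in_lane": 1.0,    # likely hits the pedestrian
    "swerve_off_road": 0.4,  # shifts the risk to the occupants
}
print(least_harm(options))   # swerve_off_road
```

The rule itself is one line; the discomfort people feel is about who wrote it, and about a car trading the occupants' safety against a stranger's.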
Is Immunity Coming?
We may reach an age when cars are truly autonomous and you cannot sue a car manufacturer over a "decision" the car makes in one of these scenarios. Manufacturers will likely seek immunity for programming the car to make one choice over another.
Until then, manufacturers and consumers have some tough ethical questions to answer.