Today we were presented with a scenario and asked to take on the role of a programmer hired to design a self-driving car.
The scenario: Our company's self-driving car is driving on the highway behind a semi-truck carrying a strapped load, and it is boxed in on both sides by an SUV and a motorcyclist. Suddenly, large boxes unexpectedly fall off the truck into the path of our self-driving car. The car cannot stop in time to avoid the boxes, so we are presented with three choices:
- Go straight into the boxes
- Swerve towards the SUV
- Swerve towards the motorcyclist
My Choice: Why I Would Program the Car to Drive Straight
Riding in a self-driving car ultimately comes with its own risks, one of them being that, at times, the computer will make the decision for you. In this scenario, I believe the most ethical option is for the car to take the hit itself rather than deliberately involving other vehicles on the road. With this choice, you are not putting other people's lives at risk as a consequence of your decision to ride in a self-driving car.
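To make the policy concrete, here is a minimal sketch of what such a predetermined rule might look like in code. This is purely illustrative: the option names and the function are hypothetical, and a real autonomous vehicle would rely on a far more complex planning system than a fixed lookup like this.

```python
# A minimal, hypothetical sketch of the fixed policy described above.
# The option names are illustrative only; they are not from any real
# autonomous-driving system.

SWERVE_OPTIONS = ["straight", "swerve_left_suv", "swerve_right_motorcyclist"]

def choose_maneuver(options):
    """Always prefer the option that risks only the car's own occupant."""
    # Going straight into the debris keeps the risk on our own vehicle
    # instead of deliberately transferring it to other drivers.
    if "straight" in options:
        return "straight"
    # The scenario we were given always allows going straight, so no
    # predetermined rule is defined for any other case.
    raise ValueError("No self-contained option available")

print(choose_maneuver(SWERVE_OPTIONS))  # prints "straight"
```

The point of the sketch is simply that the choice is made once, by the programmer, long before any accident happens, which is exactly what makes the question of fault so interesting.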
What Does This Mean for My Company as a Programmer?
Self-driving cars and their predetermined decisions complicate the idea of "fault" in accidents. Specifically, if our self-driving car were to crash into another vehicle, who would be at fault? To avoid this situation, and to avoid potentially injuring others, I would put the risk solely on the occupant of our vehicle. Although this solution may seem less appealing to potential customers, it is more beneficial in the long run because it avoids conflict with other drivers, so long as the customer signs a consent form accepting the risks that come with riding in a self-driving car. With this in mind, while we may lose sales for not prioritizing the occupant of the self-driving car, we ultimately save energy and money by avoiding lawsuits from non-customers.
Discussion Comments
Today we briefly discussed how this dilemma resembles the well-known "Trolley Problem."
The problem usually goes like this: avoid personal harm at the cost of multiple lives, or save multiple lives at the cost of a loved one's life (or, in this case, potentially our driver's life).
While one may think, "the Trolley Problem would not happen in real life!", today's discussion provided a real-life example where the same ethics apply. It is simply unrealistic to dismiss discussions of morality and ethics, and this is where the importance of taking STEM in Society at GSSE really shines through. This is just the beginning of our ethical discussions in STEM in Society, so I'm looking forward to the rest of the month!