
Day 1: Self-Driving Car Dilemma

Today we were presented with a scenario and asked to take on the role of a programmer hired to design a self-driving car. 

The scenario: Our company's self-driving car is driving on the highway behind a semi-truck carrying a strapped-down load, boxed in on either side by an SUV and a motorcyclist. Suddenly, large boxes unexpectedly fall off the truck into the path of our self-driving car. In this moment, our car cannot stop in time to avoid the boxes, so we are presented with three choices:

  1. Go straight into the boxes
  2. Swerve towards the SUV
  3. Swerve towards the motorcyclist

My Choice: Why I Would Program the Car to Drive Straight

    Riding in a self-driving car ultimately comes with its own risks, one of them being that, at times, the computer will make the decision for you. In this scenario, I believe the most ethical option is to take the hit yourself rather than deliberately involve other vehicles on the road. With this choice, your decision to ride in a self-driving car does not put other people's lives at risk. 
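    Just to make this policy concrete, here is a rough sketch of how it could be expressed in code. This is purely illustrative and not anything we actually built in class; the maneuver names and the sets of people each maneuver endangers are hypothetical.

def choose_maneuver(options):
    """Prefer the maneuver that risks only the car's own occupant.

    `options` maps each maneuver name to the set of parties it endangers.
    """
    for maneuver, endangered in options.items():
        # Pick the first maneuver that involves no one outside our own vehicle.
        if endangered <= {"occupant"}:
            return maneuver
    # If every option endangers others, default to braking and going straight.
    return "go_straight"

# Hypothetical encoding of the three choices from today's scenario.
options = {
    "go_straight": {"occupant"},                  # hit the fallen boxes
    "swerve_left": {"occupant", "motorcyclist"},  # swerve toward the motorcyclist
    "swerve_right": {"occupant", "suv_driver"},   # swerve toward the SUV
}

print(choose_maneuver(options))  # prints "go_straight"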

What Does This Mean for My Company as a Programmer?

    Self-driving cars and their predetermined decisions complicate the idea of "fault" in accidents. Specifically, if our self-driving car were to crash into another vehicle, who would be at fault? To avoid that situation, and to avoid potentially injuring others, I would choose to place the risk solely on the occupant of our vehicle. Although this solution may seem less appealing to potential customers, it is more beneficial in the long run because it avoids conflict with other drivers, so long as the customer signs a consent form accepting the risks that come with riding in a self-driving car. With this in mind, while we may lose sales for not prioritizing the occupant of the self-driving car, we ultimately save time and money by avoiding lawsuits from non-customers. 

Discussion Comments

    Today we briefly discussed the similarity of this dilemma to the ethics of the well-known "Trolley Problem." 

The problem usually goes like this:

'Avoid personal harm at the cost of multiple lives, or save multiple lives at the cost of a loved one's life' — or, in this case, potentially our driver's life.

While one may think, "the Trolley Problem would never happen in real life!", today's discussion provided a real-life example where the same ethics apply. It's simply unrealistic to dismiss morality and ethical discussions, and this is where the importance of taking STEM in Society at GSSE really shines through. This is just the beginning of our ethical discussions in STEM in Society, so I'm looking forward to the rest of the month!
