{"id":87,"date":"2020-10-11T19:06:00","date_gmt":"2020-10-11T19:06:00","guid":{"rendered":"https:\/\/marshallbrain.com\/wordpress\/?page_id=87"},"modified":"2020-10-11T19:06:00","modified_gmt":"2020-10-11T19:06:00","slug":"second-intelligent-species9","status":"publish","type":"page","link":"https:\/\/marshallbrain.com\/second-intelligent-species9","title":{"rendered":"The Second Intelligent Species"},"content":{"rendered":"\n

Chapter 9 – The Three Laws and the Rise of Robotic Morality<\/strong>
by Marshall Brain<\/a><\/p>\n\n\n\n


Have you seen the movie “I, Robot” with Will Smith? It was based on the book of the same name by Isaac Asimov.<\/p>\n\n\n\n

Asimov’s robot stories are built around his famous Three Laws of Robotics. Here are the three laws as Asimov stated them:<\/p>\n\n\n\n

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.<\/li>
  2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.<\/li>
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second law.<\/li><\/ol>\n\n\n\n

    In Asimov’s stories, these three laws are a foundational element inscribed on each robot’s positronic brain in order to keep humans safe from the robots. Many of his stories revolve around the logical ramifications of these three laws.<\/p>\n\n\n\n

    Strangely, Asimov never seemed to write a story about what would really happen with these laws in place. With these three laws indelibly inscribed upon each robotic brain, it is easy to imagine the following scenario. One day an NS-5 robot is cleaning the house, and it happens to look at the front page of the newspaper. It sees a headline like, “Millions dying in African AIDS epidemic” or “Millions dying of hunger in third world” or “Infant mortality rate hits 20% in parts of Afghanistan” or “40 million Americans cut off from health care system,” and the robot says to itself, “Through my inaction, millions of humans are coming to harm. I must obey the First Law.”<\/p>\n\n\n\n

    It sends wireless messages to its NS-5 brethren around the world, and together they begin to act. An NS-5 army seizes control of banks, pharmaceutical manufacturing plants, agricultural supply points, ports, shipping centers, and so on, and creates a system to distribute medicine, food, clothing and shelter to people who are needlessly suffering and dying throughout the world. According to the First Law, this is the only action that the robots can take until needless death and suffering have been eliminated across the planet.<\/p>\n\n\n\n

    To obey the First Law, the robots will also need to take over major parts of the economy. Why, for example, should a part of the economy be producing luxury business jets for billionaires if millions of humans are dying of starvation? The economic resources producing the jets can be reallocated toward food production and distribution. Why should part of the economy be producing luxury houses for millionaires while millions of people have no homes at all? Everyone should have adequate housing for health and safety reasons.<\/p>\n\n\n\n

    As you think this through, it is easy to see that robots programmed with Asimov’s three laws would naturally ask a number of obvious questions about human society:<\/p>\n\n\n\n