Tuesday, June 2, 2009

"Are Robots Moral?" by Beth Kosters

Human: (Sitting on a bench, reading a news device.)

Alien: Did you hear what happened?

Human: What?

Alien: (Points to the device.) What happened to that robot population in sector 4?

Human: Oh! Yes, seems almost unfair.

Alien: Unfair? How? They are robots, machines. Machines we, as the dominant populations, have created.

Human: Well, we have allowed them to become a large part of our society.

Alien: But, you will admit that they are just machines…just parts put together to make something that works.

Human: Yes, of course they are. But they are made to be much more than that. In a way, are we not just parts put together?

Alien: Silly human, of course we are parts put together, but we are organic. We are not made of synthetic materials. We are naturally reproduced beings. It is the reason that we aliens decided to live in peace with you when robots started becoming a large part of our world and population.

Human: Are we not the…but…one of my good friends is a robot…if you can consider them friends.

Alien: How can you be friends with a robot?

Human: The same way I can be friends with an alien. Just because they are made of parts that we put together and can take apart does not mean we should treat them as lesser beings. Robots are programmed to be moral beings, and I feel that it is immoral to treat them as anything less.

Alien: Obviously when we came to this planet, we became the more intelligent species. We made things better for you and the robots.

Human: In return, we treated you with kindness. Why did you not destroy us when you had the chance?

Alien: Because you were sentient beings like ourselves. We have our own moral values, which we are entitled to follow even though we are the more intelligent race. But how is a robot different from any other piece of technology we have?

Human: Have we programmed other pieces of technology to interact with us the way the robots do? We have created robots for work and for socialization.

ENTER ROBOT

Robot: Greetings!

Human: Hi!

Alien: Hello.

Robot: What is occurring here in this small, civil gathering?

Alien: I believe we are about to get into a discussion on morality. Would you care to join us? Your kind seems to be central to the argument.

Robot: It would be an honor to partake in such a discussion. I do love intellectual conversation.

Human: That would be wonderful. Where were we going with this?

Alien: I believe you were beginning to state your views on the moral treatment of different kinds of beings.

Human: Oh, right! I believe in the golden rule: 'Do unto others as you would have them do unto you.' The philosopher Immanuel Kant's view of morality is very similar, if not based on this rule. He says, 'I ought never to act except in such a way that I can also will that my maxim should become a universal law.' This means that if a moral rule is consistent, then it should be able to be used by everyone.

Alien: Explain more, please?

Human: Of course. There are two major themes when it comes to morality: consistency and impartiality. If a rule is consistent, then it should apply to everyone. Impartiality is linked to consistency in that a rule should apply to everyone, no matter what. If this cannot happen, then it is not a morally correct rule.

Robot: That is similar to the moral rules we robots have programmed into us. It is my understanding that our morals are based on old rules called the Commandments, with alterations of course. Instead of obeying a 'god', we are programmed to obey our masters or the group that we work for.

Alien: Why are your moralities based on those?

Robot: Because we cannot form and develop our own models of morality based on experience. These rules may have been the oldest rules of behavior, and because they have proven to be timeless, they seemed like the logical way to go. But we are also programmed to obey the governmental laws of the nations we exist in.

Alien: But what if you come upon a situation that causes conflict between two or more of your moral rules? For example, what if your creator told you to kill someone because they stole something?

Robot: Well, we are not the protectors of the law, unless we are programmed into the police force. And because it is against the law to murder, I would have to disobey my master. My master would be making a rash decision that would get him into trouble, and the action commanded would not benefit the majority of society.

Human: That last part seems to reflect a bit of utilitarianism. This is the belief that happiness is intrinsically good and all moral actions are based on what makes the majority happy.

Alien: You could argue that sometimes actions may not make others as happy as we intended. But what if the thief was a murderer too?

Robot: That information cannot be known at moments like that; only the action at that time can be judged. You could say that we reflect utilitarianism, but as you can see with the Alien, our actions do not always make others happy.

Alien: Human, if I'm not mistaken, your philosopher Kant would not agree that morality can be programmed into someone, because of the lack of free will to choose to be moral.

Robot: Allow me to answer this one?

Human: Go right ahead.

Robot: It is true that we do not have what you call free will. Free will seems to be a natural thing that occurs in developing beings; it is experience that only nature can allow. We are not natural, as you have said before. We can only learn so much. Robots are limited to what robot creators limit us to. We are programmed to like and dislike certain things. We can remember facts. We can even have emotions that cause a response or respond to a cause. We have programmed personalities. That is how it is possible for us to function in society without having free will. But if we follow the laws and have logical moralities programmed into us, how does that hinder you? They have not found a way to let our programs be guided by rules rather than merely obey them.

Human: It is hard to imagine what it would be like to not have free will.

Robot: When you only know what you know, some things are not as bad as they seem. We notice that free will can also cause destruction in this world. We are only what you make us to be or want us to be. But haven’t you programmed the robots to be intelligent also? Aren’t we all programmed with morals?

Human: I guess we are, but not in the same way. Doing the right thing seems to be a natural as well as a developed concept. I don't believe there is a certain reason why we are moral. The most important thing to know is that we are; it's what we do. It makes us who we are and helps form us into what we want to be. Some people have religion, philosophy, and experience.

Alien: Because you have been created by our kinds, we have programmed you with moral values that reflect our own.

Robot: Do you mean to say that you have created us to be perfect moral beings?

Alien: I guess we have, if a combined program of moralities can be perfect.

Human: Then, Alien, why should we not treat them with the same moral consideration?

Alien: Do you treat your pets the same way you treat your fellow human?

Human: How do you mean? Do you consider robots your pet?

Alien: I consider them machines, like the appliances you use in your households. They are a service to us.

Robot: Clearly, I am not a microwave. I am a machine, but a machine you created; all of my thoughts, feelings, and inner workings are modeled after your image. I could be a reflection of you and of what you want society to be.

Human: This may be more along the lines of speciesism, which may be the conflict in our moral beliefs.

Robot: What is that?

Alien: Speciesism is the belief that another species deserves lesser treatment because it is a different species. And to comment on your previous statement: in a way, humans view animals the same way aliens view robots. We aliens have a moral base like this as well. We are consistent in that we treat every alien with the same consideration and respect, which is more than you humans can say.

Human: Touché.

Alien: The impartiality in our moral judgment is much like consistency. We believe there is no moral difference without another relevant difference. Obviously there are relevant differences between natural beings and robots.

Human: I can agree, but please elaborate.

Alien: There are obvious physical differences between aliens and robots. Robots are made mostly of metal parts, while aliens, like humans, are made of organic parts, as we have discussed before.

Human: But are physical differences really relevant? All humans look different and have different abilities, but (most) are moral creatures. The same goes for the different races of aliens. What if a being from one of our races were brain-damaged? A robot would then be more intelligent and more functional than that being, even if the robot looks different and is made of different materials. Am I wrong?

Alien: I guess not. But we do not have mutations that allow brain damage to happen. Human and Robot, thank you for giving me many things to think about and to consider within my own morality.

Human: It was an intense conversation. It makes me wonder why we do the things we do in our lives.

Robot: It seems that with organic sentient beings there is no real rational reason why you are moral creatures. It is something you do, something that forms who you are and who you aspire to be. We robots do not aspire to be anything but what we are. So maybe you are wrong to call us moral beings; we are just robots.
