Monday, June 1, 2009

"Flaw and Order (a modern-day Socratic Dialogue)," by Martin Beeler

May 12, 2039- In the case of Thomas v. California, the jurors are in their final deliberation. The entire country is anxiously awaiting the outcome of this most unusual case. Never before has a citizen been charged with manslaughter because of his robot. These robots, with breakthrough AI technology, have been developed in the last few years by various research organizations. They can interact with their environment and make decisions based on that data. These characteristics are not new developments; robots with such capabilities have been around for decades. But with groundbreaking advances in nanotechnology and brain mapping, these robots can now make decisions once thought possible only for humans. They ‘learn’ as humans do, picking up on language and even ideas. Saturn Lab, which employed Dr. Thomas, created the most advanced robot yet. It has not released much information about exactly how the robot works, which has been a very frustrating part of the trial. This is the first tragedy to have come from these newest-generation robots.

Dr. Thomas was at home with the robot, a Type 2 BGBR (Big Bear) prototype. That evening, while Dr. Thomas was on the phone in his room, a man came into his house. The man was Tiberius Jones, a fellow AI researcher. Dr. Jones was well acquainted with Dr. Thomas; they had worked together on a previous project, and Dr. Jones was also on the brink of creating his own robot. His colleagues reported that he was upset with Dr. Thomas for not helping him in his research, despite having asked him several times. Dr. Thomas said he did not want to release any information on certain technological features until the robot was officially completed. After Dr. Jones came into the house, Dr. Thomas heard him talking loudly to the robot, and then a scuffle. By the time he walked into the room, he said, Dr. Jones was unconscious on the floor. Dr. Jones was rushed to the hospital and pronounced dead an hour later. The robot had apparently strangled him.

The robot was taken by the police and handed over to Saturn Lab, which then destroyed it, claiming it was a danger and that the next prototype would need some changes. Dr. Thomas was arrested and put on trial four months later. There was a lot of debate over whether it should proceed as a regular trial, but the judge insisted the case be handled no differently than any other. Despite its very unusual circumstances, the trial went on as normal. Evidence was given by dozens of scientists on both sides, and now the only thing left is the verdict. The judge has decided that this is the last day for the jury to decide Dr. Thomas’ fate.

Jury recordings, 5-12-39

Head Juror- Okay, here we go. Day 23 of the trial. I hope everybody is well rested. Judge Sanders has said we need to have a verdict by today. I want to stress once again how important this case is. It may set precedent for all other cases like this in the future, not to mention how the scientific community and the entire country view this new robotic technol-

Juror #4- All right, we’ve heard this spiel more times than I can count.

Head Juror- Alright, well let’s start it off with some further discussion on specific topics, starting with Mr. Thomas. I think we are in agreement that he had no bad intentions?

Juror #2- Well of course he didn’t mean for the robot to kill the guy, but it was still careless neglect on his part to leave the robot unattended.

Juror #9- But Mr. Jones stormed into Thomas’ house, really upset at him. And the robot might have just been protecting Thomas.

Juror #5- But I’m still not entirely convinced that it was so accidental. When the investigators checked out Dr. Jones’ lab, they noticed he was close to creating the same technology; in maybe a couple more months he would have been able to start building a similar robot. He was mad at Dr. Thomas for not assisting him with his research. I still think Thomas was jealously guarding his technology.

Juror #6- Oh, if only the robot hadn’t been destroyed, then we could ask it what happened.

Juror #2- Like we’ve said before, there is no point in bringing that up. We know it was bad for them to destroy the robot, but what’s happened happened; no need to dwell on it.

Juror #8- But Thomas even admitted the robot was not completely finished. He still had some quirks to work out. In that case, it does sound very irresponsible to me.

Head Juror- So we can agree that it was unintentional but irresponsible, yes? But specifically, should Mr. Thomas have taken the robot out of the research facility to begin with?

Juror #7- It was clearly within the research facility’s guidelines to take the robot out and acclimate it to new environments, albeit under proper supervision. Thomas’ team allowed him to take control of the project and supervise the robot. It was in the final stages of development.

Juror #6- I would not call it the ‘final stages’; like a human, it was constantly learning. In human terms, it was like a child. It was just starting to adjust to the real world. And like a child, it is not really responsible for its actions. Those responsibilities are usually entrusted to the parent. In this case, Mr. Thomas.

Juror #8- I don’t think we should give it that analogy so quickly.

Juror #6- But if this new technology really works like a human brain, then maybe we should look at it like that. It is learning, like a child. And if your child kills somebody on your property, you will be held responsible.

Juror #8- Perhaps, but this child was manufactured by Mr. Thomas, who obviously did not take the steps to ensure it was not a danger to other people.

Juror #9- He may not have taken certain steps to ensure complete safety, but we never know what will happen until it becomes reality. Sometimes it takes a terrible accident like this to fix a major problem.

Juror #2- No, this accident did not need to happen. I think you are looking at this way too lightly. With this great technology comes great responsibility.

Juror #6- But what would he have had to do to ensure safety with it? Not given it the ability to move? As his research partner stated, that would have been difficult, if not impossible. The robot’s brain is all interconnected. In order for its brain to work, everything needed to work together.

Juror #3- But wasn’t there a sleep mode on the robot during power boost?

Head Juror- Well, it says so here in the minutes, on page 33, that “We discovered the robot was not able to be in sleep mode for more than six hours every two days. It needed to stay engaged with its environment, or it would lose considerable information it had recently gathered. The computer was subsequently programmed to ensure this would not happen.” So it sounds like Mr. Thomas did not have the option to completely control this robot.

Juror #5- I still think Thomas taught the robot to defend itself against an attack, including attacks against him.

Juror #4- I would not totally reject that. I mean, the Defense Department was one of the parties most interested in Saturn Lab’s project. Maybe Dr. Thomas was a little eager for a contract and wanted to start training the robot to impress them.

Juror #7- I think the robot was a little too quick to kill, more than Thomas would ever have thought.

Juror #6- Yes, well the robot will think on its own terms. This robot was unique. I don’t know how much influence Dr. Thomas could have had on it in so short a time.

Juror #5- So short? The robot was in his possession for seven weeks! I think that is plenty of time to have trained it. He spent hours every day training it and working on the programming.

Juror #6- Do you think that is enough time to instruct it on basic human functions and language while at the same time teaching the thing ideologies and defense tactics? The way I see it, it was a bad accident and in no way could Dr. Thomas have seen this coming.

Juror #9- Mr. Thomas may not have realized what he was teaching it. The robot may have picked up the idea of protecting Dr. Thomas through its extensive use of the many learning programs the team designed for it.

Juror #2- But Thomas was carefully regulating those programs. He said he doesn’t see how it could have picked up this idea of violence.

Juror #8- Well how do you explain the fact that it strangled Dr. Jones? After examining him, the coroner concluded it would have taken a lot of deliberate strength for the robot to have killed him.

Juror #7- Yes, but nonetheless these things are too early in development for anyone to know exactly how they work and what they are capable of. There is a big responsibility, but I think this was just too unexpected to be prepared for.

Juror #9- Don’t you guys think it is punishment enough that the robot was destroyed? If it was a threat, then the threat is no more.

Head Juror- We are not debating whether punishment has already been dealt out. The judge has stressed enough that we are only to decide whether Mr. Thomas is guilty or not guilty of manslaughter. We already lost two jurors because they could not deal with that very question. I urge everyone to remember that. Now I think it is time for a ten-minute break. We will continue right after.

Head Juror- Ok, let’s continue. Any topics you want to discuss now?

Juror #3- Yes, I think we should talk about the human qualities of the robot some more. I think we could all use a little more understanding of it before we make any more decisions.

Juror #8- Like I said before, I don’t think we should look at this thing as human at all. Even though it makes decisions similar to a human’s, it is purely a machine. As such, it should be treated as one. Machine problems need to be dealt with by the person responsible for the machine.

Juror #6- Why can’t we? It was created to think like a human, more than any machine ever has before. It makes decisions on its own. This thing is as unpredictable as any human. We saw how similarly the robot’s brain worked compared to a human’s.

Juror #4- Yea, remember that video we saw with the robot interacting with Thomas? It sounded fairly human to me. It was dealing with pretty complex problems. Remember what they were talking about at the end of that video?

Juror #2- They were discussing the concept of morality. The robot seemed to be a little confused by the idea. Then again, it didn’t altogether understand the concept of violence and punishment.

Juror #5- Well if it was taught these ideas, it would have known not to kill somebody.

Juror #6- Have you ever done something you weren’t supposed to do? You never disobeyed your parents?

Juror #8- I just cannot see how a human-engineered brain can make moral decisions and deal out justice! This is starting to sound like some science fiction story about a robot cop or something!

Juror #7- Yea, well it isn’t science fiction. This is the real thing. This technology is real. We need to realize that there might be a fine line between human and machine.

Juror #4- You’re starting to sound like Descartes or something. Fine line between man and machine? Humans are not machines just as machines are not humans! Are we going to start putting animals on trial as well for crimes?

Head Juror- It is that very line that we are trying to understand here. How sentient was the robot? It was able to walk down the street and have a conversation with you. It sounds very convincing that he was in fact an engineered human, as difficult as it may seem to grasp.

Juror #4- Did you just call the robot a he?

Head Juror- Huh, I guess I did. Well the robot did have very man-like features. The voice, the body frame…

Juror #9- But the research team said it was gender-neutral.

Juror #4- Of course it was gender neutral. It wasn’t human!

Juror #6- But Mr. Thomas did tell his team that it was starting to display more masculine signs than feminine, if that makes sense.

Juror #2- Maybe this robot’s brain was too human-like to stay gender neutral. Maybe our brains do not allow us to stay neutral. What if it was displaying more masculine signs simply because it had to choose one?

Juror #7- Or because it was around Thomas so much that it was just imitating him.

Juror #5- If that is the case, maybe Thomas did in a way train it to become violent.

Juror #3- So you are saying just being a man automatically makes you violent?

Head Juror- Ok, let’s not get too heated in here. #7, you think this robot was the real deal; truly sentient?

Juror #7- Yea, I do. It’s just too complex to not be seen as sentient. It can walk like a human, talk like a human, make decisions like a human.

Juror #9- So you’re saying it was completely responsible, right? Dr. Thomas should not be blamed for this?

Juror #7- I think it’s starting to look that way for me.

Head Juror- What about you, #3?

Juror #8- I still think Dr. Thomas did not take necessary precautions. It wasn’t just foolish; it was deadly. That said, the robot attempted to mimic human behavior and decision making, but it is still just a complicated machine.

Juror #2- But aren’t we all just mimicking human behavior anyways? Can we really define exactly what a human is? We all act differently in one way or another, but we are all still human. I agree with #7. It was just too human. Sure, a robot might not be naturally made, but why does that matter?

Juror #8- What makes it not human? I don’t even know how we are debating this. It wasn’t born, it was created in a research facility. It is not biologically based.

Juror #6- Yea, but that new nanotechnology the scientist talked about sounded almost organic. The molecules that make up the brain multiplied in order to build it. Haven’t you heard about the new medical research going on? Scientists are starting to kill cancer cells through the use of nanotechnology. They are also beginning to grow new body parts this way. If you had a failing kidney and this nanotechnology helped change the structure of the kidney and make it healthy again, would you refuse the kidney because it was artificially healed?

Juror #8- Yea, but that is different. You’re talking about kidneys and stuff. I’m talking about the brain. Your brain is what makes you human.

Juror #6- What is the difference between your brain and any other organ? They are all part of your body. If one can be artificially created, why can’t the other?

Juror #8- But the brain is like the seat of your soul-

Juror #4- Here we go with Descartes again. That’s worse than basing all your psychology knowledge on Freud.

Juror #7- What is a soul? If it is the determiner for morals and other distinctly human ideas, then why do we still have violence and murder? What’s the difference between a human murdering somebody and a robot?

Juror #3- I don’t think we can decide on this until we all come to some consensus as to the difference between a human and a robot. If we can’t, then I don’t think there is any use in continuing.

Head Juror- Well, how about we vote on it? First off, let’s have a show of hands from everyone who thinks Dr. Thomas is guilty.

Jurors 4, 5 and 8 raise their hands.

Head Juror- Ok, so who believes he is not guilty?

Jurors 6 and 9 raise their hands.

Head Juror- What about the rest of you? Do you agree with #3?

Juror #7- I think we are too divided on what the robot even was. We cannot continue like this.

Head Juror- I agree with you two. I will tell the judge. We cannot come to a decision on Dr. Thomas’ verdict since we have not decided on the humanity, or if you will, the sentient nature of the robot.

End of recording, 5-12-39

Verdict- Hung Jury
