The T-shirts Accelerating Robot (TAR) team was another Senior Design Project (SDP) team that we observed in the second year of the study. The TAR team was designing a robot that launches T-shirts to spectators in a stadium at sporting events. This team had an ethics advising team as a collaborative partner in ethics discussions. The ethics advising team consisted of students who were taking a philosophy of science and technology course, and who had also received instruction in how to do ethics advising to help engineering design teams. The TAR team and the ethics advising team discussed various ethics issues directly or indirectly related to the TAR team’s project.
In the selected segment, they discussed the emergency shut-down of the robot. At the ethics advising team’s suggestion, the TAR team agreed to design a physical shut-down device, so that the operator or anyone nearby could stop the robot in an emergency. The team also mentioned that their initial solution had been to install emergency shut-down software. Here is the discussion segment.
To analyze the discussion, we (the researchers) identified a few types of meaningful keywords and highlighted them in different colors. Because this conversation focused on safety issues, we first marked keywords representing safety in green. We then marked user-related keywords in purple, and keywords about engineers’ actions in blue. The color-coded conversation, with indices, is shown below.
This discussion segment also contained many non-verbal expressions such as laughter, gestures, and sounds, so we marked them in bold, as shown below.
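The keyword-marking procedure described above can be sketched as a simple text-annotation script. This is a minimal illustration only: the keyword lists below are hypothetical examples, not the study’s actual coding scheme, which the researchers developed from the transcript itself.

```python
import re

# Hypothetical keyword lists, for illustration only; the study's actual
# coding scheme was derived by the researchers from the transcript.
KEYWORDS = {
    "green": ["safety", "emergency", "stop", "shut off"],    # safety
    "purple": ["operator", "user", "spectator", "anyone"],   # users
    "blue": ["design", "install", "pull the lever"],         # engineers' actions
}


def color_code(text: str) -> str:
    """Wrap each keyword occurrence in an HTML span of its assigned color."""
    for color, words in KEYWORDS.items():
        for word in words:
            pattern = re.compile(re.escape(word), re.IGNORECASE)
            # Preserve the speaker's original capitalization via m.group(0).
            text = pattern.sub(
                lambda m, c=color: f'<span style="color:{c}">{m.group(0)}</span>',
                text,
            )
    return text


print(color_code("Will anyone be able to stop it in an emergency?"))
```

A script like this only mechanizes the highlighting step; deciding which words count as safety-, user-, or action-related keywords remains an interpretive judgment by the researchers.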
We then studied the relationships among the keywords, their interpretive meanings in the discussion, team members’ gestures and other non-verbal expressions such as laughter, the team’s particular habits or ways of talking, and any other noticeable clues. First, we noticed that the safety concern, the users’ perspective, and the engineers’ perspective were discussed continuously throughout the segment. All of the color-coded keywords appeared from the beginning to the end of the segment, indicating that the discussion developed continuously. Second, we found an important point regarding how the TAR team dealt with ethics issues in this discussion. Read the following conversation carefully again.
Advisor: Ummm and besides the operator, I know there are only two operators, but will there be anyone actually physically able to like stop it if something does go wrong with it?
TAR Team: What we might do also, is you know, if we talked about the emergency stop earlier, and I think you know, good practice is,… what you usually do is, you have, a… software…but then also you have a physical switch all over the robot which you can run up and you know, pull the lever and it’ll shut off, so we’ll make sure to have one of those on there too.
As seen in this conversation segment, although the TAR team had thought of a software-based solution, they accepted the ethics advising team’s suggestion and revised their design by adding a physical back-up. This indicated that the TAR team had a different cultural model from the other SDP teams with regard to the relationship between safety concerns and users. Unlike the other SDP teams, the TAR team included a user’s role in the emergency stop process. This suggested that the TAR team thought a safe design could include not only a designed product but also an active role for the users. The TAR team’s cultural model is shown below.