For a few years, I’ve been interested in the formation, sustaining, and analysis of organizational culture. This refers to the ways in which organizations – be they corporations, universities, government agencies, charities, etc. – operate. That operation necessarily includes both official guidelines and behaviors, such as training manuals, employee workspaces, and emergency plans, and unofficial procedures, such as employee communication habits. Sometimes these phenomena are collectively referred to as “corporate culture,” a misleading name that nonetheless captures the significance of both the letter and the spirit of the “law” in any organization. At best, a functioning culture creates a positive environment for workers and, theoretically, leads to better products, events, and services. At worst, a dysfunctional culture creates a negative environment for employees and can potentially result in disaster. Two fields have captured my attention in this regard: the space shuttle program at NASA and nuclear power plants. Both emphasize technical expertise and process as keys to the safety of their work, but both have also proven to be rife with the kinds of culture that can lead to real tragedy.
I became interested in this after reading Diane Vaughan’s 1996 book The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. It is a fascinating analysis of the problems within NASA’s organizational and technical processes that ultimately led to the Challenger shuttle accident. Vaughan’s thesis is that the accident was the direct result of many smaller issues that caused the technicians at both NASA and its shuttle contractors, most notably Morton-Thiokol, to downplay the potential for failure in the preparations for launch. Two aspects of her argument strike me as particularly fascinating, and both hold lessons for other organizations, including academic ones.
Weak Signals: Vaughan describes a “weak signal” as “one conveyed by information that is informal and/or ambiguous, so that its significance – and therefore the threat to flight safety – is not clear” (355). The most infamous weak signal with regard to Challenger was Roger Boisjoly’s memo to his superiors at Thiokol, months before the launch, warning that the O-rings in the solid rocket boosters (the two white rockets attached to the orange external fuel tank) could fail in a cold-weather launch. Because memos were such routine communication in the NASA-related technical culture of the 1980s, no particular memo stood out from the others, according to Vaughan. Thus, the warning did not affect the processes in place the way Boisjoly thought it would.
Normalization of Deviance: Vaughan pinpoints another problematic aspect of NASA’s technical and organizational culture, which was paradoxically one of its most celebrated features – its extraordinary achievements and the immense risk attached to them. She argues that NASA had grown so used to the risk involved in shuttle launches that it tended to “normalize” certain behavior and actions as part of an overall acceptance of the risks of its work. In other words, even though each shuttle launch was an extraordinary and incredibly risky achievement, the technical culture surrounding the shuttle program treated the project as much more “normal” than it ever really was. Deviations or abnormalities were always seen within the context of what the shuttle program had already achieved. In the case of the Challenger tragedy, neither the launch-day temperature nor the O-ring wear and tear was automatically flagged as an issue that would cancel the launch, which had already been rescheduled. Rather, because the O-rings’ condition didn’t appear to be any worse than it had been on other launches, it fell within the realm of acceptable risk. After all, if the rings didn’t look any worse for wear than they had after previous successful launches, then there was no reason to be concerned. This is significant because after the accident, many fingers were understandably pointed at the various actors who either ignored or glossed over these apparent defects and conditions. However, as Vaughan points out: “actions that analysts defined as deviant after the disaster were acceptable and non deviant within the NASA culture” (120). This is perhaps understandable – things within an organizational culture become normalized, even if they would appear out of the ordinary to an outsider.
However, within the specific context of the shuttle program, the organizational culture failed to fully respect the extraordinary nature of the work. After all, over the thirty years the program existed, only 135 launches were made. That is far from old hat; by comparison, think of how many airplanes fly around the globe every day – after 100+ years of flight, that is old hat. In 1986, the Challenger mission was to be just the 25th mission of the entire program (not counting the five atmospheric test flights with the Enterprise orbiter). Thus, any deviation from established parameters should not have been normalized.
Lessons – while academia may not be the most dangerous or extraordinary organizational culture, I think there are some lessons to be drawn from these events. First, consider the category of weak signals. We may not write and send a lot of memos, but I wager everyone receives a lot of emails all day, every day. Those emails from the Dean or from Academic Affairs? The numerous conference spam emails? Questions from students or colleagues? It can be difficult to sort through them all, pay attention to the ones that matter, and respond in a timely fashion. Thus, we would do well to consider that emails are really academia’s own form of weak signals – when we send course updates via email to our students, are we sending the strongest signal? Perhaps an in-class announcement or a statement uploaded to the course website would be more effective. The same considerations apply to other forms of communication – maybe the idea you pitch to the Vice Chancellor at the semester kickoff wine and cheese reception never goes anywhere. Is that because you sent a weak signal? Perhaps they didn’t think you were serious, because chitchat is not the same as a formal proposal. Whatever the communication need, we should all consider the most effective means of transmitting it.
I also think we should consider how our organizational culture might appear to an outsider. Academia is often inscrutable, a series of chutes and ladders that we have all been forced to navigate since graduate school, and which never really came with a clear set of rules. Thus, we accept a lot of risk that, in other professions, would seem “deviant” – buying plane tickets to interview at conferences with no guarantee of further job prospects, for example. We normalize a lot of behavior that would be considered less than professional in other industries. Those long-tenured faculty who never attend meetings? That, too, gets normalized. When we simply go with the flow instead of being reflexive, disaster can result. These are not disasters on the level of a NASA failure, but they can negatively affect important things such as student retention, research projects, collaboration amongst colleagues, and the everyday work of the university.
In Part II, I will unpack some things I’ve learned about organizational culture from the nuclear power accidents at Chernobyl and Fukushima. Those lessons are similar to the ones learned from NASA, but with the added element of regulatory issues that were either ignored or slow-walked, and which likely contributed directly to the accidents.
Note: the problems identified after the Challenger disaster were not entirely eliminated from NASA’s culture. In fact, they were repeated almost to the letter in the Columbia shuttle disaster of 2003.