The Columbia accident served as a reminder that safety in human spaceflight goes beyond technical issues to encompass the culture of an organization, including how it uses language. (credit: NASA)
Next year, if everything goes according to plan, American astronauts will once again ride into space on board American spacecraft. This has been a long time coming, and at least part of the delay has been due to NASA conducting extensive safety reviews of the new spacecraft and space companies learning how to design and operate spacecraft safely. NASA is out of practice with launching rockets with humans aboard, and Commercial Crew involves a different approach to operations and oversight than NASA has traditionally used. Hopefully those overseeing this return to flight will be looking at past programs for lessons and insights. If they do, they will almost certainly look at the experience of the 2003 Columbia accident.
Columbia still has important things to teach us. To begin with, safety is more than just procedures and rules and checks and backups and fault trees. One of the lessons the Columbia accident taught us is that safety is about culture, the things a group of people do that are not formally written down, but which can nevertheless turn out to be very important. One of those is how people use, and misuse, language when talking about things that are safe and unsafe.
The role of NASA’s culture in the Columbia accident was one of the hardest things for outsiders to understand, and one of the hardest things for those of us involved in writing the accident investigation report to convey. Most people, especially engineers, could grasp the technical issues—the force of the foam that came off the shuttle’s external tank and hit the front of the orbiter’s wing—but culture was abstract and amorphous. What is it? Simply put, “culture” is the unwritten rules and procedures and even traditions of those who worked on the shuttle program, the ways of doing things that were passed on from one person to the next.
One aspect of that culture that those of us who worked for the Columbia Accident Investigation Board learned about was the term “in family.” I still remember, more than 16 years later, when I first heard that confusing phrase mentioned by somebody working on the CAIB. I asked him what it meant. He laughed, and said that was the problem: it was not clear what it meant when NASA employees and contractors used it. Then I got a short lesson.
“In family” was a term frequently used by engineers on the shuttle program. The problem was that it was not really an engineering term. And it didn’t mean the same thing to different people, or even the same thing in different situations, or even the same thing over time.
In some cases, “in family” meant that data, or an observed result or event, was within a predicted range. But for other things, “in family” meant that an observed result or event may have been outside of a predicted range, but was still acceptable from a safety standpoint. The obvious example for Columbia was the issue of insulating foam coming off the external tank. When the shuttle was first built in the 1970s, NASA had a strict rule that a CAIB board member once characterized as “foam shall not come off the tank,” because it posed a safety risk to the orbiter. In other words, foam coming off the tank was “out of family.”
But once the shuttle started flying, foam was observed coming off the tank. Soon, shuttle engineers and managers concluded that foam could damage the orbiter, but the damage would be minor. It would be a maintenance issue, but not a safety issue. Those on the shuttle program thus started to talk about shed foam as being “in family,” which meant that it was acceptable.
If you start to think about how knowledge gets transferred within an organization, you can begin to see the problem with this. “In family” can mean both “within the predicted range” as well as “outside the predicted range, but still acceptable.” Those are opposites. Now imagine a young engineer who learns that “in family” means the latter definition. He observes an event that is outside the predicted range, but he considers it acceptable and tells his supervisor that it is “in family.” It is possible that he has thus passed on that something is safe when it isn’t.
One of the other uses of “in family” was that an issue was no longer worth paying attention to. It was “in family,” so move on to the next thing instead of asking for more data. Often in review meetings somebody would say that something was “in family” and therefore didn’t require any further discussion. We heard “in family” a lot from NASA people, and I believe what rankled those working on the CAIB wasn’t simply the inaccuracy and inconsistent use of the term, but how casually people used it. There’s an old joke about how some people see a glass as half full and others see it as half empty, but an engineer sees 8 ounces of liquid in a 16-ounce glass. That’s the kind of precision that we expected from NASA, and didn’t always see.
This was an example of what author Diane Vaughan wrote about in her book on the 1986 Challenger accident. Vaughan warned of the “normalization of deviance,” whereby events or data outside of accepted values get normalized over time and come to be considered acceptable. For Challenger, it was low temperatures and O-rings. We brought Vaughan into the CAIB first as a guest, then as a consultant. The “in family” issue was another manifestation of deviance being normalized, although in some ways it was both a symptom of larger problems in the shuttle program and a cause of sloppy communication.
The whole “in family”/“out of family” jargon bothered some of us on the CAIB because it was so sloppy and prone to misuse, and I have a vague recollection of at least one person, possibly even a board member, suggesting that our report recommend that NASA ban the phrase. I think this was rejected by the CAIB’s chairman, retired Admiral Hal Gehman, who said that it wasn’t our role to tell NASA what language to use. Instead, we told them to practice engineering rigor.
Wayne Hale, who led the shuttle’s return to flight (RTF) as Launch Integration Program Manager at Kennedy Space Center following the Columbia accident, confirmed to me that this term was rampant and badly used in the shuttle program at that time, and said he wishes that the CAIB had recommended banning it. “In practice, Bill Parsons (Space Shuttle Program Manager) and I effectively did ban the use of that term ‘in family’ during RTF for exactly the reasons you cited,” he wrote me. Hale added that he has rarely heard the term used regarding human spaceflight in the years since the shuttle resumed flying, and since the program ended in 2011.
The reason I mention the “in family” issue is because it illustrates just how complex and even abstract safety issues can be. An organization can have written rules and procedures, and external oversight, and all the formal trappings of a safety system, and yet still be doing something that threatens safety, like using imprecise language and teaching the wrong lessons to the people in that organization. And people can do this and not even realize that they’re impacting safety just by talking.
Hopefully, as NASA prepares to once again launch American astronauts into space, they will be reviewing their procedures, as well as the lessons learned from prior accidents, and trying to contemplate the hidden threats to the astronauts, who are, after all, part of the NASA family.