If, as the cynic writes, “The only thing we learn from history is that we never learn from history” ... where does that leave the lessons of the Apollo 1, Challenger and Columbia space catastrophes?
NASA is observing a "Day of Remembrance" on Thursday to honor the astronauts lost during those three events, which all took place in late January and early February. But with the space shuttle program winding down, is there anything to be learned from that trio of tragedies?
In engineering terms, the three disasters taught lessons that space workers already knew but had forgotten — or at least had not thought to be important enough to sway operational choices. And all three disasters share a common root cause. That's the big reason why I resist calling these events "accidents": There was nothing random about them; rather, they were consequences of specific choices.
The Apollo fire, on Jan. 27, 1967, was made possible by a decision to believe that flammability in pressurized pure oxygen couldn’t be that bad, even if no tests had been run to check out that convenient assumption.
The loss of Challenger, on Jan. 28, 1986, was made possible by the decision that even though flexible O-ring seals had never been verified to function properly at sub-freezing temperatures, it was convenient to assume they would, absent tests proving otherwise.
The loss of Columbia, on Feb. 1, 2003, was made possible by the decision — set down in writing — that even though observers suspected that tank debris might have hit the panel-covered leading edge of the shuttle's left wing, those panels were "probably" just as tough as the tile-covered areas that had been hit on previous flights. As with the earlier tragedies, this was a claim that had never been tested.
Appallingly, in hindsight, when tests were made, they showed hazards that should never have been "assumed away." Pure oxygen fires were horrifically violent. Flexible pressure seals at low temperatures did not seat properly in their slots when hit by rocket thrust. And a leading-edge panel hit by a flying piece of insulation didn’t just get scratched — it shattered.
Of course, these three gross misjudgments weren’t the only times that space workers had made convenient assumptions. In other cases, the guesses turned out to be correct, or the feared scenario didn’t occur, or the mission was just lucky. But success at “dodging bullets” isn’t evidence that one is bulletproof, and such bad choices have a way of catching up with the chance-takers. Spaceflight is a particularly unforgiving environment for self-serving pretense and convenient assumptions, as these disasters (and others that fortunately didn’t involve the loss of life) keep reminding us.
The attitude toward safety that works, and that space workers knew all along was correct, is one of relentless verification of all assumptions. When in place, it makes spaceflight, in the words of former NASA Administrator Mike Griffin, "barely possible" at a minimum level of risk (but never at a totally safe level).
Two excruciating problems on recent shuttle missions show that this proper attitude is now in place. The shuttle Atlantis' launch in 2008 was delayed for weeks by issues with a fuel tank quantity gauge. And NASA launch workers are currently dealing with a months-long delay for Discovery's final scheduled mission, due to problems with cracked spars on the fuel tank. Both problems were baffling and frustrating, but in both cases the shuttle team was committed to determining "root causes" rather than conveniently assuming that the missions could proceed. This took time, and testing, and patience — and it worked.
For the spars on Discovery, it turned out that an insidious combination of a variation in one metal fabrication batch and unusual assembly stresses caused the failures. Engineers determined those causes by reproducing the problems in laboratory tests. One shuttle had already launched with spars that probably were cracked but went undetected — and fortunately, the spars still performed adequately. But a simple and cheap repair on all future spars from that batch has now been implemented.
Have lessons sunk in too late?
Now, the scary part is that the country at last has a human spaceflight operations team that fully “gets it” with regard to the only known approach that successfully minimizes risk ... and by the end of this year, that team will be dissolved. New teams, on new human space vehicle programs, will be pressing forward. These teams — at SpaceX, Boeing, United Launch Alliance, Orbital Sciences, Sierra Nevada and other spaceship companies — include veterans as well as workers who have had relatively little experience with NASA's culture of spaceflight.
They will undoubtedly be faced with the same attractive temptations of "convenient assumptions" as they encounter difficulties and puzzles in the development process. What are the chances that from the start they will consistently make the smart choices rather than the easy ones, considering how hard it was for the NASA team to come around?
How will they resist the temptation to rely exclusively on their intuitions, their judgments, their guesses, even in the absence of double-checking with Mother Nature? As the bumper sticker tells us, "Man forgives, God forgives, nature never" — to which must be added, “outer space never, ever."
Here is where these current anniversary commemorations and recommitments take on far more than purely historical significance. A hideous price has been exacted. It has fallen heavily on those directly involved and on their families, to be sure, and also on all who strive to open the space frontier, and on the country's future. But the payment of that price can go toward an awesome purchase, if future space workers know how to complete the transaction.
What's important for the next generation in spaceflight is not just knowledge — a cookbook of do’s and don’ts that must be followed by rote — but wisdom, which is the ability to make good choices.
The facts of safety are widely known, but it can be hard to let cold, hard facts drive difficult decisions on spacecraft design, on operational concepts and ultimately on spaceflight itself. We think we "know" some things about safety that in our hearts we don't fully "believe" and act on. And after a disaster, we may realize that we did know better, but still made the wrong choices.
The memories of the men and women who died due to past misjudgments help us bridge the gap between "knowing" and "believing" how to avoid poor choices. If that linkage between future choice-makers and past victims is missing, the list of casualties can only grow longer.