Look at All the Data and Be Wary of Unjustified Confidence

Interesting interview with Richard Feynman about NASA's space shuttle Challenger disaster. He explains very well the problem of failing to think about all of the data and how systems produce results with variation.

“Results” are not enough to judge whether the current process is wise. He describes a child who runs into the street without looking, is warned by his parent, and counters with the evidence that nothing happened. A child who repeats this several times can think they have evidence that it is not unsafe, but that isn't so.

With the Challenger disaster, a simple view of the data analysis problem was a failure to look at all the data: a failure to look systemically. Instead they looked at just the data points where problems were seen, and none of those problems had been catastrophic. If you looked at all the data, it was pretty obvious that cold weather greatly increased problems, and if you listened to the engineers, those problems were very serious and risked catastrophic results.
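
That selection effect can be made concrete with a few lines of Python. This is a minimal sketch using made-up illustrative numbers (not the actual flight records): filtering the data down to only the flights that had problems makes temperature look unimportant, while the full data set shows the trouble-free flights clustered at the warm end.

```python
# Illustrative sketch of the selection problem. The numbers below are
# hypothetical, chosen only to show the effect; each tuple is
# (launch temperature in degrees F, o-ring incidents seen on that flight).
all_launches = [
    (53, 3), (57, 1), (58, 1), (63, 1), (66, 0), (67, 0), (68, 0),
    (69, 0), (70, 1), (70, 0), (72, 0), (73, 0), (75, 2), (76, 0),
    (78, 0), (79, 0), (80, 0), (81, 0),
]

def mean(values):
    return sum(values) / len(values)

# Looking only at flights where problems appeared (roughly what the
# pre-launch discussion did) makes temperature look unimportant:
# incidents show up across a fairly wide temperature range.
incident_temps = [t for t, n in all_launches if n > 0]
print("incident flights, avg temp:     %.1f F" % mean(incident_temps))

# Adding back the trouble-free flights reveals the contrast: the flights
# with no incidents cluster at the warm end, so cold launches stand out.
clean_temps = [t for t, n in all_launches if n == 0]
print("trouble-free flights, avg temp: %.1f F" % mean(clean_temps))

# A cruder cut makes the same point: in this illustrative data, every
# flight below 65 F had a problem, while only a few warmer flights did.
cold = [n for t, n in all_launches if t < 65]
warm = [n for t, n in all_launches if t >= 65]
print("below 65 F, flights with incidents:    %d of %d"
      % (sum(1 for n in cold if n > 0), len(cold)))
print("at/above 65 F, flights with incidents: %d of %d"
      % (sum(1 for n in warm if n > 0), len(warm)))
```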


In a similar way, it isn't safe to run into the street without looking, but there are some cases where it is far riskier: busy streets, streets where motorists are going fast, streets where motorists can't see you run into the street very well. You probably don't want to engage in the risky behavior of running into the street without looking no matter what, but you really shouldn't do it when the conditions are bad.

He doesn't talk much in this interview about looking at all the data and paying attention to indications of things getting worse (even if not quite catastrophic yet), but he did in his written report. He does cover the other major problem: managers who wanted to proceed allowed good results (not having been run over the other times they ran into the road without looking) to convince them that the engineers (the parent, in the kids-running-into-the-street case) were just overly worried and not as "brave" as they were.

Optimism can be helpful or harmful. When optimism is used as social pressure to ignore evidence of challenges, that is often bad. It is fine to have everyone understand the data and make a decision that the risk is worth taking. Often, as in NASA's case, optimism (and a desire to please those higher in the hierarchy) is used without understanding the system and the risks.

In my experience, when people say "don't be negative" it is most often a sign of trying to suppress evidence (The Problem is Likely Not the Person Pointing Out The Problem). If it is a case where the risks and concerns are understood and it is just a matter of judgement about whether the risk is worth the potential reward, I don't mind. But often that phrase is used to suppress even raising issues, in the same way that NASA's management system did, and that led to the catastrophe.

Related:
- 5s at NASA
- Accept Taking Risks, Don't Blithely Accept Failure Though
- Richard Feynman Explains the PDSA Cycle
- Science and the Excitement, the Mystery and the Awe of a Flower
