Accounting, O-rings, and the Dangers of Extrapolation

I’m currently “knees-deep” (“elbows-deep”?) in a Managerial Accounting course that is part of the University of Nevada’s outstanding Executive MBA program, and my cohort is studying Cost Behavior. Part of the discussion focuses on what is called “Relevant Range”, which is defined as “the range of activity over which we expect our assumptions about cost behavior to hold true.” The topic reminded me of a very important danger that should always be on our minds when conducting statistical analysis: extrapolation.

Space Shuttle Challenger
One of the reasons the dangers of extrapolation have always stood out to me is that I had the misfortune of witnessing their effects first-hand, at a young age. I was a seventh-grade student at a school in central Florida on January 28, 1986, and the Space Shuttle Challenger happened to be scheduled to launch right in the middle of our lunch break. As we all watched the shuttle’s familiar arc through the Florida sky, we soon realized that this launch didn’t look like the others. If you are familiar with the history of space exploration, then you know this was Challenger’s last launch; it exploded 73 seconds into the flight, killing all seven astronauts aboard.

But how does this tragedy relate to statistical analysis and extrapolation?

The temperature at Cape Canaveral (where the launch took place) was an unseasonably low 31°F at launch time. The O-rings that sealed the joints of the solid rocket boosters were made of a material known to lose its resiliency in cold weather. NASA had to decide whether or not to launch, and the decision-makers had to balance a lot of factors.

Hindsight is always 20/20, and extensive analysis of the process leading up to the launch decision has identified many contributing causes: communication failures, “groupthink”, and even poor presentation graphics.

But one of the biggest lessons for CPI practitioners is about extrapolation, and here’s why: NASA had been measuring O-ring performance on prior launches, but all of those measurements were taken at temperatures between 53°F and 81°F. Not only was 31°F below the range measured previously; it was WAY beyond the range that had been observed and analyzed. By assuming that the existing measurements (and the model built from them) would extend past the range of observed occurrences, NASA left the realm of statistics and embarked on a journey into probabilities… And if you enjoy playing roulette as much as I do, you’ll know that probability and statistical analysis are not the same thing.
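To make the danger concrete, here is a minimal sketch of the trap. The numbers below are entirely hypothetical (they are not the real Challenger flight data): we fit a simple least-squares line to “damage” observations collected only between 53°F and 81°F, then naively plug in 31°F and watch the model confidently report a value it has no business predicting.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x (intercept a, slope b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical O-ring "damage scores", observed only between 53°F and 81°F
temps  = [53, 57, 63, 66, 70, 75, 78, 81]
damage = [11, 8, 5, 4, 2, 1, 1, 0]

a, b = fit_line(temps, damage)

# Inside the observed range, the line is a reasonable summary.
# At 31°F we are 22 degrees below any data point: the model still
# returns a number, but nothing guarantees the relationship is even
# linear out there -- this is extrapolation, not inference.
prediction_at_31 = a + b * 31
print(f"fit: damage ≈ {a:.1f} + ({b:.2f}) * temp")
print(f"naive prediction at 31°F: {prediction_at_31:.1f}")
```

The point of the sketch is that the code raises no warning at 31°F; the responsibility for noticing that a prediction lies outside the relevant range rests entirely on the analyst.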

How about you?  Have you ever experienced the negative consequences of extrapolation when analyzing processes?  Join the conversation and share your experiences!