THE SIGNAL AND THE NOISE: WHY SO MANY PREDICTIONS FAIL – BUT SOME DON’T


This is something of a book report, or maybe a book recommendation.  The book, which has the same name as this article, is by Nate Silver.  I didn’t think Silver, or anyone else in the world for that matter, had any business writing this book.  And I don’t say that because Silver didn’t do a good job.  He did a GREAT job!  The book discusses predictions for everything from earthquakes to NBA games to poker to disease.  I have no idea how Silver was able to get his arms around all of these matters, but he did, and he did it well. 

One key take-away from the book is that all predictions are better represented as probabilities, with probability-density functions.  There isn’t really a 15% chance of rain tomorrow; that’s just a number that conveniently summarizes the probability-density function of the predicted precipitation.  And, as it turns out, some (many?) weather forecasters routinely over-predict the chance of summer showers, abandoning their science.  Why?  Well, they would rather you end up with a sunny day for your picnic than get rained out and blame the forecaster for a lousy prediction.  Silver wrote that Pierre-Simon Laplace (of Laplace transform fame) saw probability as a “waypoint between ignorance and knowledge”.  Interesting concept. 
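To make that first point concrete (this is my own sketch, not something from the book): the single number in a forecast is just one summary of a whole distribution of possible outcomes.  The short Python snippet below invents a hypothetical rainfall distribution, with made-up parameters, and shows how a “15% chance of rain” and an “amount if it does rain” both fall out of the same probability-density function.

    import numpy as np

    # Hypothetical illustration only: invent a forecast distribution for
    # tomorrow's rainfall and read different summaries off of it.
    rng = np.random.default_rng(42)
    n = 100_000

    # Made-up model: ~85% of simulated outcomes are dry; wet outcomes follow
    # a skewed (gamma) distribution of rainfall amounts in inches.
    dry = rng.random(n) > 0.15
    rainfall = np.where(dry, 0.0, rng.gamma(shape=1.5, scale=0.2, size=n))

    chance_of_rain = np.mean(rainfall >= 0.01)   # probability of measurable rain
    p50, p90 = np.quantile(rainfall[rainfall > 0], [0.5, 0.9])

    print(f"Chance of measurable rain: {chance_of_rain:.0%}")
    print(f"If it rains: median {p50:.2f} in, 90th percentile {p90:.2f} in")

The “15%” is the first of those print-outs; the rest of the distribution never makes it into the forecast, and a forecaster with the “wet bias” Silver describes is simply nudging that one summary number upward.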

Silver quotes Nobel Prize-winning economist Thomas Schelling as saying:

“There is a tendency in our planning to confuse the unfamiliar with the improbable.  The contingency we have not considered seriously looks strange; what is strange is thought improbable; what is improbable need not be considered seriously”. 

That is a very bad place for us engineers to be. 

After quoting Donald Rumsfeld about his “known knowns, known unknowns and unknown unknowns”, Silver discusses the 9/11 Commission and its findings.  Based on a discussion with Rumsfeld, Silver writes that there were four types of systemic failure that contributed to the dismissal of the very clear signals of the pending 9/11 attacks.  The failures were of policy, capabilities and management but, most importantly, of imagination.  As Silver says, “the signals just weren’t consistent with our familiar hypotheses about how terrorists behave”. 

This all comes back home to engineering, and especially to the design, operation and monitoring of tailings facilities.  We can’t be trapped by our limited imaginations or dismiss the improbable when we are assessing tailings facilities and their potential to fail.

Every time I’ve tried to guess what Mother Nature would do, she has made a fool out of me.  And she didn’t even care.  Fortunately, my errors were pretty minor. 

Think of Murphy’s Law.  When Murphy said “whatever can go wrong will go wrong”, he didn’t mean that the world is “going to hell in a handbasket”.  He meant that if something isn’t specifically designed to avoid doing a particular thing, then it surely will do that very thing.  If a tailings dam isn’t designed to avoid every potential type of failure, it surely will find that one way to fail. 

As a great friend of mine is fond of saying, when you start something like the analysis of a tailings dam, you have to cast the net broadly. 

