Sunday, January 8, 2012

Multiple causes, multiple consequences - Part 2

"Is there any other point to which you would wish to draw my attention?"
"To the curious incident of the dog in the night-time."
"The dog did nothing in the night-time."
"That was the curious incident," remarked Sherlock Holmes.


~ From "Silver Blaze" in "The Memoirs of Sherlock Holmes" by Sir Arthur Conan Doyle


Noticing what didn't happen can be both difficult and important: difficult because things that don't happen don't register on our senses; important because what is absent may provide vital clues about what we need to do to improve or, conversely, to avoid disaster.

Consider the following examples:

Example 1: Edward Jenner's observation that milkmaids did not generally get smallpox led to his discovery of vaccination (from the Latin word for "cow"). By looking at people who did not get the disease, he derived a way of preventing it.
Example 2: During World War II, the patterns of bullet holes in returning aircraft were studied to determine where the aircraft should be reinforced. The statistician Abraham Wald, however, had the insight that the bullet holes in surviving aircraft were clearly non-fatal, and that it was the areas without bullet holes which were more likely to need reinforcing; this was confirmed by studying the wreckage of planes that had been shot down. (Note: this is an over-simplification; for Wald's original work see the links below.)
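A toy simulation can make Wald's logic concrete. The two-region model and all the numbers in it are my own illustrative assumptions, not figures from Wald's paper:

```python
import random

random.seed(42)

# Toy model: each plane takes hits uniformly over two regions.
# Hits to the "engine" region are often fatal; hits to the
# "fuselage" region are survivable. (Illustrative assumptions only.)
REGIONS = ["engine", "fuselage"]
P_FATAL = {"engine": 0.8, "fuselage": 0.05}  # per-hit loss probability

def fly_mission():
    """Return (survived, hits) for one plane taking 1-5 random hits."""
    hits = [random.choice(REGIONS) for _ in range(random.randint(1, 5))]
    survived = all(random.random() > P_FATAL[region] for region in hits)
    return survived, hits

survivor_hits = {"engine": 0, "fuselage": 0}
for _ in range(10_000):
    survived, hits = fly_mission()
    if survived:  # we only get to inspect the planes that come back
        for region in hits:
            survivor_hits[region] += 1

print(survivor_hits)
# Although hits land on both regions equally often, surviving planes
# show far fewer engine holes -- precisely because engine hits are
# fatal. The scarce holes mark the area that needs the armour.
```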
Similarly, by noticing what information is missing, we may prevent ourselves from making faulty attributions about causality.

Consider the following example:

Suppose that in one group people follow strategy A, which leads to major failure 90% of the time and outstanding success 10% of the time. And suppose that in a second group they follow strategy B, which rarely leads to major failure but generally leads to moderate success. If the only information available to us were the instances of outstanding success under strategy A, we might falsely conclude that strategy A is better than strategy B.
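A quick simulation makes the trap visible. The payoff numbers below are purely illustrative assumptions:

```python
import random

random.seed(1)

def strategy_a():
    """90% major failure (-100), 10% outstanding success (+100)."""
    return 100 if random.random() < 0.10 else -100

def strategy_b():
    """Reliable moderate success (+10), rare major failure (-100)."""
    return -100 if random.random() < 0.02 else 10

trials_a = [strategy_a() for _ in range(10_000)]
trials_b = [strategy_b() for _ in range(10_000)]

print("Mean payoff A:", sum(trials_a) / len(trials_a))  # around -80
print("Mean payoff B:", sum(trials_b) / len(trials_b))  # around +8
print("Best single outcome A:", max(trials_a))          # +100
print("Best single outcome B:", max(trials_b))          # +10
# If we only ever hear about the outstanding successes, strategy A
# looks brilliant; the full distribution shows B is far better.
```

Strategy A's winners look spectacular, but the averages tell the opposite story: only the full distribution of outcomes, failures included, reveals which strategy is actually better.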

Jerker Denrell has published numerous articles on precisely this issue: how studying only successful organisations and individuals provides a misleading picture of the types of strategies that lead to success. For example, in "Predicting the Next Big Thing" he argues that making an accurate prediction about an extreme event may in fact be an indication of poor judgement. It may simply indicate that the person makes extreme judgements in general and that this time they got lucky. But unless we look at the full picture, we may conclude that such a person is some kind of genius.
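One way to illustrate Denrell's point is with a toy forecasting model (the distributions and thresholds below are my assumptions, not his): a forecaster who always predicts extreme values will occasionally "call" a rare event exactly, while being far less accurate on average than a cautious forecaster.

```python
import random

random.seed(7)

def outcome():
    """The quantity being forecast: usually near 0, occasionally extreme."""
    return random.gauss(0, 10) if random.random() < 0.95 else random.gauss(0, 100)

cautious_error, extreme_error, extreme_hits = 0.0, 0.0, 0
N = 100_000
for _ in range(N):
    actual = outcome()
    cautious_error += abs(actual)          # cautious: always predicts 0
    extreme_pred = random.choice([-100, 100])
    extreme_error += abs(actual - extreme_pred)
    if abs(actual) > 80 and abs(actual - extreme_pred) < 30:
        extreme_hits += 1                  # "called" an extreme event

print("Mean error, cautious:", cautious_error / N)
print("Mean error, extreme: ", extreme_error / N)
print("Spectacular correct calls by extreme forecaster:", extreme_hits)
# The extreme forecaster racks up memorable correct calls on rare
# events while being far less accurate overall. Looking only at the
# hits would make them seem like a genius.
```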

So, we need to ask ourselves:
  • What am I not seeing that I should be seeing?
  • What is this person/organisation's track record like (i.e. not just their personal best)?
  • Is this attribute or characteristic common to failures as well as successes?
  • What isn't happening in this situation?
  • What didn't happen that was critical?
  • What information am I missing that is necessary to make a valid judgement?
  • What information do I need to collect and analyse to see what is really going on?
This last question is extremely important.

There are organisations where none of the following are documented: why a decision was made, what information and analysis it was based on, and what its outcomes and consequences were. As a result, there is virtually no capacity within such an organisation to learn from errors or refine decision-making, nor is there any accountability: a recipe for mediocrity.

The way to avoid this is to be vigilant in documenting what was done (and thus what may in retrospect be seen to have been overlooked), and to look not just for what happened but for what didn't happen. It is the missing piece of the jigsaw that is needed to show the full picture.
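One lightweight way to put this into practice is a structured decision record. The sketch below is purely illustrative; the field names are my own assumptions about what such a record might capture, including a field for what was expected but did not happen:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DecisionRecord:
    """An illustrative decision-log entry (field names are assumptions)."""
    decision: str                      # what was decided
    rationale: str                     # why it was decided
    evidence: list                     # information and analysis relied on
    alternatives_rejected: list        # what was considered but not done
    outcome: str = "pending"           # consequences, filled in later
    absent_signals: list = field(default_factory=list)
    # absent_signals: things we expected to see or happen, but didn't
    decided_on: date = field(default_factory=date.today)

record = DecisionRecord(
    decision="Adopt strategy B for the new product line",
    rationale="Reliable moderate success preferred over rare big wins",
    evidence=["simulation of both strategies", "full track records"],
    alternatives_rejected=["Strategy A (high variance)"],
)
print(record)
```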


"Psychology and Nothing" Eliot Hearst ( American Scientist Volume 79) is an interesting article which discusses the perceptual and cognitive difficulties of seeing what isn't there (Unfortunately, I haven't been able to locate a free version of this paper on the Web)

"Failure is a Key to Understanding Success" (Stanford GSB News, January 2004)

"The Weirdest People in the World" (Henrich, Heine and Norenzayan): posits that most psychology research is based on a very narrow sample, people from Western, Educated, Industrialized, Rich, and Democratic (WEIRD) societies, and that as a result the findings may not be as generalisable to human beings in general as is usually thought.

"A Method of Estimating Plane Vulnerability Based on Damage of Survivors" (Abraham Wald)
"Abraham Wald's Work on Aircraft Survivability" (Mangel and Samaniego)
