In this post, I’ll continue my discussion of key ideas from Taleb’s The Black Swan, with an examination of four of them: the fallacy of silent evidence, confirmation error (or platonic confirmation), epistemic arrogance, and future blindness.
The Fallacy of Silent Evidence. Nassim Nicholas Taleb (NNT) points to our tendency to view historical evidence through a filter that selects “the rosier part of the process” while ignoring the parts that don’t fit our preconceptions, or at least failing to recognize (or forgetting) that part of the historical process is inaccessible to us and therefore silent. The effect of such a filtered historical record is to exclude from our awareness the evidence that doesn’t fit our mental models. For example, suppose we think that banks are robbed by people whose first name is John, and we do a Google search for bank robbers named John. The evidence that there are bank robbers whose name is not John will be silent, because Google hasn’t retrieved that part of the record. NNT also discusses our tendency to view famous authors as uniquely talented and to attribute their success to their talent. However, we have no access to the works of the hundreds of thousands, or millions, of authors whose books were never published. So evidence of their talent, or lack of it, is silent; we cannot evaluate whether talent explains authorial success, and we should not infer that “talent” is the explanation for success based only on our observation that successful authors are talented.
Confirmation Error (or Platonic Confirmation). This error is similar to, but distinct from, the fallacy of silent evidence: one actively looks “for instances that confirm your beliefs . . . and find them,” without looking for negative evidence that would refute them. The paradigm case is the partisan in an argument who seeks out facts that confirm his thesis but ignores or hides everything that might weaken it. In political campaigns, candidates do their best to point to facts suggesting they are the right person for the office they seek, while doing everything they can to hide or obscure the parts of their record that suggest otherwise. The Bush Administration’s activities in the run-up to the War in Iraq provide a striking example of platonic confirmation. Every piece of data that could possibly support the conjecture that Saddam’s regime had weapons of mass destruction was marshaled to make the case and persuade Congress and the public that those weapons existed, while every piece of data that called that conclusion into question was ignored. A perfect case of confirmation error.
Epistemic Arrogance. According to NNT, the difference between what someone thinks they know and what they actually know is critical. If what they think they know exceeds what they actually know, that is “epistemic arrogance.” If what they actually know exceeds what they think they know, that is “humility.” NNT favors those who are humble, and most favors those who hold their “own knowledge in greatest suspicion.” Again, the Bush Administration seemed to be characterized by a high degree of epistemic arrogance in foreign policy, economic policy, social issues, and apparently legal matters as well. It’s too early to know whether the same is true of the Obama Administration. In Knowledge Management, it is best to teach and encourage a culture of humility, so that people will test their knowledge severely and ensure that what they rely on has survived testing and proved strong.
Future Blindness. This is the idea that we seem unable to anticipate what the future will feel like, even when we can predict what will occur and have had similar experiences in the past. NNT mentions the experience of buying a new car: the excitement at the prospect, and the anticipated joy and happiness that will ensue from the new-car experience. But we forget what our last new-car experience was like, and in particular we forget that after a very short time, perhaps just a few weeks, we will get used to the car, take it for granted, and no longer feel the initial uplift we felt when first acquiring it. Another example may be a woman’s prediction of the pain she will feel in childbirth. Before having her first child, it is hard for many women to imagine the pain that may accompany the experience; and once a woman recovers from childbirth and heals fully, the memory of the pain tends to fade, so her anticipation of having a second child will often not accurately forecast the experience of pain.
All four of these ideas are important for Knowledge Management. If we can train ourselves to mitigate future blindness, we can be much more attuned to what the occurrence of Black Swans may mean for us. As for the other three, the first two, silent evidence and confirmation error, seem related to the ideas of epistemic humility and fallibilism. Fallibilism encourages suspicion about what one thinks one knows and so leads to epistemic humility; and epistemic humility, in turn, helps us overcome both confirmation error and the fallacy of silent evidence. As to the first: if we are skeptical of what we know, we will look not only for what confirms it but also for what refutes it. As to the second: if we are skeptical of what we know, we will be much more alert about drawing positive conclusions in situations where silent evidence exists and is inaccessible to us. So fallibilism can lead to epistemic humility, to avoidance of confirmation error, and to acknowledgment of silent evidence that makes it unwise to draw conclusions from partial evidence. Fallibilism and epistemic humility can also make us more aware of the possibility of unexpected events violating our mental models, and that awareness in turn can motivate us to try to overcome future blindness.