“For some of our most important beliefs we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous—and it is also essential.”
2002 Nobel Laureate Daniel Kahneman [ii]
An astute philosopher suggests making a daily habit of looking in the mirror and asking ourselves whether everything we believe with complete certainty might be wrong.
Beliefs are our brain’s way of making sense of and navigating our complex world. Our brains rely on these energy-saving shortcuts to organize and evaluate the overwhelming amount of information we must process, and to form predictions about what might happen next. This enables efficient learning, and is often essential for survival. More often than not, the beliefs we form are fairly reliable representations of reality. But the trade-off for this efficient way of making sense of the world is that not infrequently our beliefs are mistaken. Adding to their fallibility, many of our beliefs are formed emotionally and unconsciously – our rationalizations for them are formed after the fact, in our attempt to explain and justify them to ourselves. And many of our beliefs are simply intuitive gut feelings: they just “feel” right.
Beliefs are often concerned with understanding the causes of things, and are based on pattern recognition: if ‘b’ closely followed ‘a’, then ‘a’ might be assumed to have been the cause of ‘b’. Our brains connect the dots. While this is often a good approximation of reality (e.g. we got sick after eating spoiled food), our brains have a habit of over-identifying patterns where none exist (e.g. our child first showed signs of autism around the same age as he received his first MMR vaccination; we fail to recognize that this happens to be exactly the age when the most obvious signs of autism first become evident, and that the MMR is irrelevant).
We are not just pattern-seekers; we are also intentional agents, wired to behave in motivated, purposeful ways in almost everything we do. And as social animals, we have evolved to be very adept at recognizing purposeful, intentional action on the part of other people, as well as on the part of predators and prey [iii].
The problem is that we are so primed to connect the dots, and to perceive intention, that we are inclined to over-identify patterns, and to over-attribute agency or purpose to inanimate objects and random natural occurrences. This is the evolutionary basis of animism (the belief that plants, inanimate objects, and natural phenomena have souls) and of spirituality in general. Most of us no longer believe that storms, earthquakes, or volcanoes are the work of angry, vengeful agents. But many of us still wonder if illnesses strike us because of some moral failing, or maybe in order to teach us a valuable life lesson.
We have a particular propensity to think that coincidences are meaningful (everything happens for a reason). The familiar example is when someone we were just thinking about phones us. In actuality, for such a coincidence to truly be remarkable, we should have made a prediction ahead of time that this particular event would occur at this particular time (tip: write down your prediction every time you make one, and meticulously record both the hits and the misses). Otherwise we cannot conclude anything about the probability of this specific coincidence. We have merely observed the probability of any subjectively resonant coincidence occurring at any time (maybe our wedding theme song will play on the radio just after thinking about it, on any arbitrary day). We are unjustifiably impressed, judging the seeming improbability of an event after it has occurred, and assigning it significance through our egocentric, self-referential lens. In fact, the more eerie and spine-chilling a coincidence feels, the more skeptical we ought to be about our habit of thinking that the universe is intentionally designed for us.
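The arithmetic behind this "any coincidence, any time" point can be sketched in a few lines of Python. The numbers here (200 candidate coincidences, a 1-in-10,000 daily chance for each) are invented assumptions for illustration, not measurements:

```python
# Illustrative sketch: the odds of one specific, pre-registered coincidence
# versus the odds of *some* coincidence, given many candidates.
# All numbers below are invented assumptions for illustration.

p = 1 / 10_000   # assumed daily chance of any one specific coincidence
n_events = 200   # assumed number of events that would feel "meaningful"
days = 365

# Chance that the ONE coincidence we predicted in advance occurs this year:
p_specific = 1 - (1 - p) ** days

# Chance that AT LEAST ONE of the many candidate coincidences occurs:
p_any = 1 - (1 - p) ** (n_events * days)

print(f"one pre-registered coincidence this year: {p_specific:.1%}")  # a few percent
print(f"some resonant coincidence this year: {p_any:.1%}")            # near certainty
```

Nothing hinges on the particular numbers: as long as there are many ways for a day to feel uncanny, some of them will keep happening, even though each one, predicted in advance, would be genuinely rare.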
This general human tendency to think that “everything happens for a reason” (… and it’s all about me) can be magnified to an extreme in mental illnesses. In states of psychosis and mania, people tend to develop blatantly false unshakable beliefs – delusions – which can range from plausible yet mistaken beliefs to extremely bizarre ideas. The most common types of delusions involve the false belief that unrelated, coincidental, or innocuous events, actions, or objects refer to the individual in a personal way. My patients suffering from delusions perceive deliberate intention in random events. They detect hidden messages or signs and tell me that certain events couldn’t possibly be mere coincidence, and they are convinced that these refer to them – usually in paranoid ways, sometimes in grandiose ways. They present all kinds of evidence that they consider irrefutable in support of such assertions. The problem is that they are connecting too many dots.
The fallibility of normal human subjectivity, with all the cognitive biases that regularly escape our notice, is the reason we must rely on the scientific method to probe reality more objectively. The scientific method relies on rigorous critical thinking, a healthy amount of skepticism, scrupulous attention to detail and methodology, and harshly critical peer review. It is far harder than most people realize to demonstrate with a high level of confidence that ‘a’ is the cause of ‘b’. Ten anecdotes are no more reliable than one anecdote, and one hundred are no more reliable than ten [iv].
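The anecdote point can be illustrated with a toy simulation (the recovery and sharing rates below are invented assumptions): when stories are shared selectively, collecting more of them does not bring us closer to the truth; it only pins down the bias more precisely.

```python
import random

random.seed(0)

# Invented assumptions for illustration:
TRUE_RECOVERY_RATE = 0.5     # the remedy does nothing; half recover anyway
P_SHARE_IF_RECOVERED = 0.8   # recoveries get told and retold
P_SHARE_IF_NOT = 0.2         # failures mostly go unmentioned

def collect_anecdotes(n):
    """Gather n shared stories and return the recovery rate among them."""
    stories = []
    while len(stories) < n:
        recovered = random.random() < TRUE_RECOVERY_RATE
        p_share = P_SHARE_IF_RECOVERED if recovered else P_SHARE_IF_NOT
        if random.random() < p_share:
            stories.append(recovered)
    return sum(stories) / n

for n in (10, 100, 10_000):
    print(f"{n:>6} anecdotes -> apparent recovery rate {collect_anecdotes(n):.2f}")
# However many anecdotes we gather, the apparent rate hovers near 0.8,
# not the true 0.5: a biased sample does not improve with size.
```

Under these assumptions, the share of recoveries among shared stories converges to 0.8 (since 0.5 × 0.8 of all cases are shared recoveries and 0.5 × 0.2 are shared failures), no matter how many anecdotes we collect. Only a method that samples outcomes independently of how they turned out, i.e. a controlled study, recovers the true rate.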
A lack of critical thinking might seem harmless when it influences people’s behaviours for relatively minor decisions, like buying homeopathic remedies or engaging in superstitious rituals. Disenthralling people by debunking their cherished beliefs may seem unnecessary or even unkind. But an inability to critically appraise whether ‘a’ causes ‘b’ inevitably influences more consequential choices and behaviours. As the great science popularizer and astronomer Carl Sagan put it, “Gullibility kills.” [v]
For example, credulous beliefs regularly have tragic consequences for suggestible, vulnerable, and desperate patients with potentially treatable cancers. Far too many people choose alternative therapies, whose practitioners tell them what they want to hear, while rejecting scary but evidence-based chemotherapy. They are frightened by their oncologist’s inability to give them the certainties they seek.
The good news is that critical thinking, while not natural or intuitive, can be learned, and scientific literacy is becoming much more widespread in modern societies. We’ve come a very long way from making human sacrifices to appease the gods, burning witches who lived next door for causing our household’s bad luck, and exorcising demons from people suffering from psychosis. Science is a candle in the dark [vi].
title: The origin of this insightful and pithy quote is uncertain. It has been used elsewhere by several writers, e.g. Thomas E. Kida, Don’t Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. Amherst, NY: Prometheus Books, 2006.
[ii] Daniel Kahneman, Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011, p. 209.
[iv] Frank Sulloway, quoted by Michael Shermer in “Show Me the Body,” http://www.michaelshermer.com/2003/05/show-me-the-body/ (posted May 2003).
[v] Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark. New York: Random House, 1995, p. 218.
[vi] Carl Sagan, The Demon-Haunted World: Science as a Candle in the Dark. New York: Random House, 1995.