A couple of years ago, a woman comes into Beth Israel Deaconess Medical Center for a surgery. Beth Israel, in Boston, is the teaching hospital for Harvard — one of the best hospitals in the country. This woman comes in and she's taken into the operating room. She's anesthetized, the surgeon does his thing — stitches her back up, sends her out to the recovery room. Everything seems to have gone fine. And she wakes up, and she looks down at herself, and she says, "Why is the wrong side of my body in bandages?" Well, the wrong side of her body is in bandages because the surgeon has performed a major operation on her left leg instead of her right one. When the vice president for health care quality at Beth Israel spoke about this incident, he said something very interesting. He said, "For whatever reason, the surgeon simply felt that he was on the correct side of the patient."
When we are right, we just feel right. Just like this surgeon did: he felt he was on the correct side of the patient. Just like most of us feel we are right about politics, economics, facts, health and our wellbeing. This visceral feeling of being right feels so real that we rarely turn to outside evidence to 'objectively' assess the truth. And even when we do, we're in trouble. Why? Because of one strong cognitive bias. It is called motivated reasoning.
Motivated reasoning is an emotion-biased decision-making phenomenon studied in cognitive science and social psychology. It often gets us into trouble, as we can easily find ways to justify our decisions and behaviours by looking for the clues that fit our beliefs. I would say that motivated reasoning is confirmation bias taken to the next level. It leads us to confirm what we already believe while ignoring contrary data. When we face a 'truth' that is too hard for our brains to handle, we develop elaborate rationalizations to justify holding beliefs that logic and evidence have shown to be wrong.
Now pause for a second and think of one thing you are absolutely sure you are… right about. And think of the data that would make you think otherwise. Imagine you are wrong. Like really, really wrong. How does that feel? What evidence would you need? How did you conclude you were right in the first place? Did you analyse the contrary evidence, or just jump for joy when you found evidence that confirmed your beliefs?
Motivated reasoning responds defensively to contrary evidence, actively discrediting that evidence or its source without logical or evidentiary justification. Clearly, motivated reasoning is emotion driven. Social scientists seem to assume that it is driven by a desire to avoid cognitive dissonance. Self-delusion, in other words, feels good, and that is what motivates people to vehemently defend obvious falsehoods.
This is a problem. Our judgment is strongly influenced, unconsciously, by which side we want to win. It seems as if we treat the arguments that support our thesis as allies, while we treat contrary evidence (no matter how convincing) as an enemy. And this is ubiquitous. It shapes how we think about our health and our relationships, how we decide how to vote, what we consider fair or ethical. What is most scary to me about motivated reasoning is how unconscious it is. We can think we are being objective and fair-minded and still wind up ruining the life of an innocent patient, as happened at Beth Israel.
For one thing, it is a flaw of our minds that probably has a tribal, evolutionary explanation. For us as a species, it was important to stick with the people who shared our beliefs and were 'right' about things, while opposing those who were wrong. But there is one more thing that worries me. Throughout our education, we are strongly advised not to be wrong. Think of your formative years in school. Probably by the time you were nine years old, you had already learned, first of all, that people who get stuff wrong are lazy, irresponsible dimwits — and second of all, that the way to succeed in life is to never make any mistakes. That is why you must feel right. And you think you are right. Or at least you think that you do.
1,200 years before Descartes said his famous "I think, therefore I am," St. Augustine sat down and wrote "Fallor ergo sum" — "I err, therefore I am." Augustine understood that our capacity to screw up is not some kind of embarrassing defect in the human system, something we can, or should strive to, eradicate or overcome. It is totally fundamental to who we are. Because, unlike God, we don't really know what's going on out there. And it is good to be wrong. And it is alright to say… I do not know.