But Beliefs Become Barriers
"If we want to understand how we err, we need to look at how we believe." — Kathryn Schulz
To change your mind, you have to move from believing you were right to believing you were wrong, or from believing you were wrong to believing you are right. The hard part for humans is changing what we believe. So today we need to look at how we believe, in order to understand how we change our minds.
We are not the rational decision-making animals we would like to believe we are; we are more likely the rationalizing animal. People prefer to justify mistaken beliefs rather than change them. This is a major problem in American political discourse today.
But it isn’t only a problem in politics. The Oxford Dictionaries website chose “post-truth” as its word of the year for 2016, defining it as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” In the post-truth era, the borders blur between truth and lies, honesty and dishonesty, fiction and non-fiction, and deceiving others becomes a habit. A recent think piece in the Huffington Post titled “Post-Truth Nation” stated the idea succinctly: the greatest problem of our future is not political; it is not economic; it is not even rational. It’s the battle of fact versus fiction.
The cause of our inability to separate fact from fiction, and of our unwillingness to acknowledge when we are wrong, is the well-known and well-accepted confirmation bias, sometimes called “myside bias” (H. Mercier & D. Sperber). Confirmation bias means that we notice and remember information that confirms what we believe, and ignore, forget, or minimize information that disconfirms what we believe. This is why people sometimes adopt misinformation instead of information. If reason is designed to produce sound judgements, confirmation bias is a deterrent to reason. Changing one’s mind can feel like a crime, because it usually requires a change in beliefs. And because humans rationalize (devise self-satisfying but incorrect reasons for their behavior), they often do not change their minds or their behavior.
And because fact and fiction, information and misinformation, are so easy to find on social media, they are very hard to distinguish. Changing your mind isn’t a crime, but it is very hard to do nowadays. The “crime” occurs when we ignore or deny being wrong. Kathryn Schulz calls this “error blindness”: whatever falsehoods each of us currently believes are necessarily invisible to us. One’s false beliefs, that is, one’s errors of judgement, must remain invisible for one to stay comfortable and confident.
When we are wrong, we need to understand how we are wrong; we have to trace where our beliefs came from and how they affect what we “know”. False knowledge is usually the result of error blindness, which is usually the result of confirmation bias.
"Faced with the choice between changing one’s mind and proving there is no need to do so, almost everyone gets busy on the proof." — John Kenneth Galbraith