The Most Common Cognitive Bias

 Man prefers to believe what he prefers to be true. Francis Bacon

Many years ago Francis Bacon gave a good definition of confirmation bias, if you can forgive his male-dominant language. Or how about Karl Weick: Your beliefs are cause maps you impose on the world, after which you “see” what you have already imposed. This blog is about how our beliefs often aren’t facts, and about the most common bias, the confirmation bias: interpreting evidence to confirm our existing beliefs. As promised, I want to help us get better acquainted with this bias.

It is tempting to point to Donald Trump as the best example of the role of the confirmation bias in his 5,000 lies. He certainly believes to be true what he prefers to be true. However, there seem to be more serious mental and psychological issues there. How about looking instead at how man and woman prefer to believe?

Man, woman, you, me, Republicans and Democrats: all of us are good at employing confirmation biases. We do it without knowing we are doing it, and we do it knowing we are doing it. The prevalence of decision biases is a function of human habit, and a product of human nature: hardwired and highly resistant to feedback.

To me, this is a significant issue today, maybe even a crucial issue, because of the role of bias in current political discourse and the current popular literature about the role of bias in human decision making. Reason and reality are hard to find in politics and rational thinking is hard to find in personal decision making. A common culprit is the popular confirmation bias.

To put it simply, the confirmation bias is believing to be true what we want to be true. This is how man and woman prefer to believe. Most people have very little awareness of what or how they believe. Belief awareness is like self-awareness: it is hard. And you probably can’t achieve either one by yourself. Getting acquainted with and understanding your beliefs may require some outside help. And the main reason you need help is the confirmation bias.

Ask someone for help. Is what I believe really true? Get feedback from friends and “enemies”. Am I employing the confirmation bias? Or some other cognitive bias? Discuss these biases with others. Get acquainted with your belief system. This constant self-reflection is necessary and helpful because of the common bias blind spot. Researchers find that everyone has a bias blind spot: the failure to notice our own cognitive biases. The bias blind spot is the tendency to recognize the impact of biases on the judgment of others while failing to see the impact of biases on one’s own judgment. This is why getting feedback from others is helpful, and hard.

Why do you see the speck in your brother’s eye but fail to notice the beam in your own eye?  Luke 6:41

Human self-deception is one of the most impressive software programs ever devised.  David Nyberg


Posted in Beliefs | 2 Comments


Can They Pretend They Don’t See?

The answer, my friend, is blowing in the wind.  Bob Dylan

How many times must a president tell a lie before we know he isn’t truthful? How many times must a president display incompetence before we see him as incompetent? How many times must a Republican congress ignore presidential unfitness before they declare him unfit? The answers, my friend, are blowing in the wind. Apparently enough politicians aren’t willing or able to grab those answers. Whether they are unwilling or incapable is the crucial issue. Incapable means not able to know the answers to what is wrong. Unwilling means knowing what is wrong but not admitting it.

The answers that are blowing in the wind are maybe being caught by others, those not in the White House or in Congress. How many times can the public observe unfitness, unwillingness and/or incompetence in government before they see? Current events suggest that now may be the time. Maybe, finally, too much is too much.

It seems obvious that the voting public can no longer wait for the elected politicians to wake up and see. Democracy is supposed to be of the people, by the people and for the people. The people will soon be given a chance to make democracy real.

We may have reached a time when too many times has occurred. Voters may be catching the answers in the wind and voting accordingly. We will see.



Posted in Democracy in Danger | 3 Comments


Who Could Ask For Anything More?

Everything begins with belief. What we believe is the most powerful option of all.  Norman Cousins

I’ve got beliefs, and I believe they may be one of the most important parts of me. They are key ingredients of my cognitive toolbox, and maybe also the most important part of my decision making skills and my behavioral repertoire. That sounds important to me.

But beliefs are subjective and subject to cognitive biases. Beliefs are not all the same; they vary in certainty. I believe there is a belief continuum: dogmatic beliefs at one end, tentative beliefs at the other. Dogmatic believers believe their belief is the absolute truth, no questioning. Tentative believers believe their beliefs are hypotheses, open to questioning. Other people don’t think much about what they believe and don’t want to bother questioning; these are passive believers, somewhere in the middle of the continuum.

I favor tentative believers; they investigate the origin and utility of their beliefs. They seek evidence to support or revise their beliefs; this is being open-minded. Of course, not all beliefs are treated as hypotheses. We all have beliefs that bind us and blind us. And we all have built-in biased mechanisms (cognitive biases) for believing that our beliefs are true and others’ beliefs are false. The first step toward illumination is admitting that we have these biases, and acknowledging that our beliefs may not be the truth. The next step is a process of investigating and understanding the cognitive biases behind our beliefs.

Today cognitive biases are the hot topic in the popular decision making literature. They are also big news in political reporting. My blog writing has been promoting a collective worldview that is open-minded and inclusive. Open-mindedness and inclusiveness are two deterrents to cognitive biases. A belief that is open-minded is not certain or dogmatic, and can be changed. A belief that is inclusive is not subject to tribalism. Political decision making today is clearly mostly biased. So is personal and organizational decision making. Can this be overcome by getting acquainted with and identifying cognitive biases? Maybe we should practice by starting with the most common cognitive bias, the confirmation bias. Stay tuned.

Confirmation bias, also called myside bias, is the tendency to interpret new evidence as confirmation of one’s existing beliefs or theories. It is the tendency to search for, interpret, favor, and recall information in a way that confirms one’s preexisting beliefs or hypotheses. When people would like a certain idea or concept to be true, they end up believing it to be true.







Posted in Beliefs | 1 Comment


 And Bounded Rationality 

 A personal history of decision making advice.

Moral algebra (1772) and bounded rationality (1958) are examples of decision making advice. Since 2010, my blogs have been my contribution to decision making advice giving. This blog is a partial review of some of the decision advice, including my own, that I have recorded about our ability to engage in rational, meaningful decision making.

In 1772, Benjamin Franklin gave his famous decision advice to a friend and called it Moral Algebra. This became known as the “pros and cons strategy”, although Franklin included some techniques for personally weighing and balancing the pros and cons, thus Moral Algebra. Take the choice with the most pros. It may not have been the first piece of decision advice, but it is one of the most famous early ones.
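Franklin’s weighing-and-balancing procedure can be sketched in a few lines of code. This is only an illustration of the idea, not Franklin’s own notation; the decision, the reasons, and their weights below are made up.

```python
# A sketch of Franklin's "Moral Algebra": give each pro and con a
# subjective weight, then compare the weighted totals of the two sides.

def moral_algebra(pros, cons):
    """pros/cons: dicts mapping a reason to a subjective weight
    (a higher number means a stronger reason)."""
    pro_total = sum(pros.values())
    con_total = sum(cons.values())
    if pro_total > con_total:
        return "take it"
    if con_total > pro_total:
        return "leave it"
    return "undecided"

# Hypothetical example: deciding whether to accept a job offer.
pros = {"better pay": 3, "interesting work": 4}
cons = {"long commute": 2, "less vacation": 1}
print(moral_algebra(pros, cons))  # -> take it  (7 vs. 3)
```

The weights are the whole game here, which is why Franklin called it algebra rather than just listing pros and cons: a single heavy con can outweigh several light pros.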

In 1958, Herbert Simon introduced his concept of bounded rationality, for which he later won the Nobel Prize (1978): the idea that when individuals make decisions, their rationality is limited by the available information, the tractability of the decision problem, the cognitive limitations of their minds, and the time available to make the decision. Decision makers in this view act as “satisficers”, seeking a satisfactory solution rather than an optimal one.
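The contrast between satisficing and optimizing can be made concrete with a small sketch (the apartment example, names, and scores are mine, for illustration only): a satisficer accepts the first option that clears an aspiration level, while a classical optimizer scores every option to find the best.

```python
def satisfice(options, score, aspiration):
    """Simon's satisficer: return the first option whose score meets
    the aspiration level, without examining the remaining options."""
    for option in options:
        if score(option) >= aspiration:
            return option
    return None  # no option was "good enough"

def optimize(options, score):
    """The classical-rationality alternative: examine everything, pick the best."""
    return max(options, key=score)

# Hypothetical example: choosing an apartment by a 0-10 desirability score.
apartments = {"A": 6, "B": 9, "C": 7}
print(satisfice(apartments, apartments.get, 5))  # -> A  (first "good enough")
print(optimize(apartments, apartments.get))      # -> B  (the true optimum)
```

The satisficer stops at A even though B is better; that is exactly Simon’s point about limited time and information.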

One of the biggest lessons learned over these years is about the power of unconscious thinking. Willis Harman (1998) pointed out that the unconscious mind influences our decisions and can be considered a gold mine or a rubbish heap. Over the next 20 years several popular books tried to explain how to make “somewhat rational decisions”.

To name just two: In his 2006 book The Happiness Hypothesis, Jonathan Haidt came up with a metaphorical image as he marveled at his weakness of will: I was a rider on the back of an elephant. The rider is Haidt’s rational mind and the elephant is his emotional mind. He says the rider is an advisor or servant, not a king, president or charioteer with a firm grip on the reins. The rider is conscious, controlled thought. The elephant is everything else. In Haidt’s metaphor the rider can’t just decide to change and then ask the elephant to go along with the program. Lasting change can come only by retraining the elephant, and that’s hard to do. As the Buddha explains: The mind is a wild elephant.

Thinking, Fast and Slow (2011) by Daniel Kahneman is the most recent, and maybe the current standard bearer. Kahneman won the Nobel Prize in 2002, changed popular theory, and made it clear that human decision making is not, and probably cannot be, totally rational. He summarized current thinking about thinking by identifying two types: System 1 (fast, automatic, intuitive) and System 2 (slow, effortful, rational). Both System 1 and System 2 are subject to cognitive biases. So, there you have it: two imperfect systems for making judgments and decisions. How do we improve them? What Kahneman suggests is to get better acquainted with our cognitive biases. I plan to make that a major objective of future blogs.

Decision makers will make better decisions when they expect their decisions to be judged by how they are made, not only by how they turn out.  Daniel Kahneman


Posted in Beliefs | 1 Comment


Is The Way I See Things

Three baseball umpires:
I call ‘em the way I see ‘em.
I call ‘em the way they are.
They ain’t nothing until I call ‘em.

This famous baseball story is a great metaphoric illustration of the way we see things. We usually think we see things like the second umpire: the way things are. We think other people see things like the first umpire: the way they see things. The third umpire’s view, they ain’t nothing until I call ‘em, highlights the notion of personal perception: seeing and calling our reality. Nothing exists until we perceive, label, and interpret it.

In this blog I want to use the baseball metaphor to illuminate the way I see things, and maybe the way you see things. I am hoping that the popular umpires’ story might have some benefit. Each umpire actually believes they see things the way they say they see things. You and I actually believe we see things the way we say we see things. The way the three umpires see things could help us see the way we see things. I can realize I see things the way I see ‘em, and they ain’t nothing until I see ‘em and call ‘em. Which way do you describe the way you see ‘em?

Notice that the way I “see ‘em and call ‘em” depends on my personal perceptions and subjective interpretation. Perception is the key. Some descriptions of perception:

Perceptions are portraits, not photographs.  Daniel Gilbert

Much of what we take to be perceptions are actually conceptions, mental and not empirical.  Ken Wilber

By the time perceptual information reaches consciousness, each individual has transformed it into something new and unique.  Andrew Newberg and Mark Waldman

Kathryn Schulz, in her book Being Wrong, says the major reason we can get things wrong is that our perception of reality is always our interpretation of reality; this implies wiggle room. I have often written about this wiggle room in my blogs. For example: Believing is seeing. Whenever there is belief there is interpretation, and room for error. The reason the wiggle room of perception is important is that the way we see things determines the way we do things. Perceptions become behavior.

Perception is not only fallible; it is also partial. This partial blindness is our inability to see the wholeness of reality. We are “visually impaired observers”; what we see is not all there is. Our view of reality is partial, incomplete: an “isolated observation”, a snapshot, not the “big picture”. In all visible things there is a hidden wholeness.  Thomas Merton

The way the three umpires see things can help me see, understand, and illuminate the way I see things. I can ask myself: “Is the way I say I see things actually the way I see things?” You might ask yourself the same question.

We don’t see things as they are; we see things as we are. Anaïs Nin

Posted in Beliefs | 1 Comment


Is It Possible?

The goal in America today is to resurrect the primacy of reason over passion.  Jeffrey Rosen, President of the National Constitution Center

This blog is the result of my reading the October 2018 issue of The Atlantic, “Is Democracy Dying?” And since the title and subtitle above are almost a summary of my 200-plus decision making blogs, this blog is about democracy and decision making. Democracy and decision making are clearly interconnected.

The system of government delineated in the constitution is a concession to the idea that humans are deficient in the science of rational self-government.  Jeffrey Goldberg, Editor of The Atlantic

Several articles in The Atlantic (“Losing the Democratic Habit”, “The Threat of Tribalism”, “A House Divided”, “Building an Autocracy”) make it clear that democracy is being threatened by out-of-control political favoritism, tribalism, self-serving biases, and emotional thinking. Rational self-government requires rational decision making. A healthy democracy depends on rational decision making by politicians and voters. The overwhelming presence of undisciplined social media has almost eliminated rational thinking. Facts and truth (reason and rational thinking) are hard to come by.

Inflammatory posts based on passion travel farther and faster than arguments based on reason.  Jeffrey Rosen

A lie can travel halfway around the world while the truth is putting its shoes on.  Mark Twain

Democracy and decision making are also interconnected with beliefs. My blogs, starting in 2012, have been about the categories of beliefs and democracy, in addition to other decision making categories. This Atlantic issue relates closely to my beliefs about beliefs and democracy. For example: Democracy is a most unnatural act. People have no innate democratic instinct; we are not born yearning to set aside our own desires in favor of the majority’s. Democracy is, instead, an acquired habit.  Yoni Appelbaum

I believe rational decision making is a most unnatural act. People are not born to be rational, objective decision makers. Credulity appears to be an instinct; open-mindedness needs to be learned. The Atlantic recommends reviving democracy by teaching it early in schools. I have also recommended teaching decision making in schools; I published a decision making curriculum for high school students in 1973.

The Atlantic articles raise the question: Is democracy dying? I raise the question: Can decision making become a rational process? Is reason over passion an achievable goal? Time will tell.

If humans are deficient in the science of reason in self-government, it may also be that humans are deficient in the science of rational decision making.

Could it be that a democracy of reason isn’t built for humans?                                             Could it be that rational decision making isn’t built for humans?

Posted in Beliefs | 2 Comments


As Decision Rules Of Thumb

 A heuristic is a mental shortcut that allows people to solve problems and make judgments quickly and efficiently. These rule-of-thumb strategies shorten decision-making time and allow people to function without constantly stopping to think about their next course of action. Heuristics are helpful in many situations, but they can also lead to cognitive biases. (Verywell Mind).

Much has been written lately about heuristics and cognitive biases; I have also contributed several recent blogs. This blog is a review of the advantages and disadvantages of these decision making shortcuts. Heuristics could also be called rules of thumb. A rule of thumb is an easy to remember guide that falls somewhere between a mathematical formula and a shot in the dark.  Tom Parker

Cognitive biases always seem to be getting in the way of our thinking and deciding. Recognizing that heuristics and rule-of-thumb decision strategies can be both helpful and harmful is beneficial in today’s world of information overload. Rules of thumb were first used to make up for a lack of facts. Today we need rules of thumb because of too many facts, which can be problematic in decision making. Sometimes you don’t have the time or the ability to discover the best way to do something. Or there may not be a best way. This is when you need a homemade recipe or an easy to remember guide.

In 1991, I introduced four paradoxical decision principles that could be considered rules of thumb or heuristics in my book Creative Decision Making: With Positive Uncertainty.*

  • Be focused and flexible about what you want. This principle will help you create your goals and discover new ones.
  • Be aware and wary of what you know. This principle will help you to appreciate knowing and appreciate not knowing.
  • Be realistic and optimistic about what you believe. This principle will help you realize that your beliefs influence your reality and your behavior.
  • Be practical and magical about what you do. This principle will help you use both your head and your heart in deciding.

Totally rational, by-the-book decision making is considered almost impossible today. So a little help from non-rational, intuitive decision strategies would seem useful. My many blogs since 2012 are full of heuristics and rules of thumb for decision makers. For example: Always be rational, unless there is a good reason not to be.

Since we apparently can’t usually make rational decisions, we need some non-rational strategies for deciding. Humans need all the help we can get.

The best-laid plans of mice and men are usually about equal. Murphy’s Laws

* Crisp Publications. Creative Decision Making: Using Positive Uncertainty, 1991; revised edition 2003, with Carol Gelatt as coauthor.



Posted in Beliefs | 2 Comments