Mental Heuristics and Biased Thinking
BY: T. Franklin Murphy | February 2018 (re-written February 18, 2021)
Our narrow-minded thinking filters new information. We get stuck in ruts of cognitive bias, defending narrow perspectives and protecting cherished deceptions.
Our minds color information, blending, shading, and distorting reality to fit personal schemas. We create a stable, articulable world full of images and theories that we refuse to examine. Our narrow-minded thoughts filter new information, biasing our perception of reality to fit preconceived notions.
We constantly monitor the inner and outer world for information. Data flows into our brain from sense organs (eyes, nose, ears, etc.); the information is converted to some kind of meaningful code; and then we conduct mental operations to manipulate and transform the information into something useful.
The mental operations adhere to personal rules, or heuristics. These heuristics speed processing, allowing for quick reactions to novelty by assimilating new input into already accepted theories about the world. Speedy processing provides an evolutionary advantage by improving predictions about the future. We predict danger and opportunity.
We create rules for processing through experience. Without the backdrop of prior knowledge, reactions stall as our brain sludges through overwhelming flows of fresh facts; sorting out what is relevant or important is impossible without some standard of measurement.
Heuristics are our standard of measurement: the unquestioned truths we rely on to process new information. Robyn Dawes and Reid Hastie warn in their probing research into rational choice that "heuristics are efficient, but sometimes inaccurate procedures for solving a problem...providing rough-and-ready estimates of frequencies, probabilities, and magnitudes" (2009, location 1908).
Amos Tversky and Daniel Kahneman wrote "in general, these heuristics are quite useful, but sometimes they lead to severe and systematic errors" (1974, p. 1124).
Our biased interpretations speed thinking, but at the expense of accuracy. Sometimes speed trumps accuracy; other times it hinders us. We act on crude interpretations inferred from fragmented information—biases then fill the spacious unknowns.
We interpret the world through preferred meanings, and then conveniently ignore the fallibility of our thought. We are narrow-minded. Once we act on these beliefs, our investment in their truthfulness strengthens, enhancing the bias. Subtly, we bypass conflicting new facts in favor of comforting interpretations of irrelevant data.
Our beliefs have a cascading effect. The more we act or argue a belief, whether or not it is true, the stronger that belief becomes (Harrison, 2013, location 1192).
A heuristic is a mental shortcut that allows an individual to make decisions, pass judgments, and solve problems quickly and efficiently. Heuristics are unconscious rules or biases we use to come to conclusions with minimal information.
We smoothly discard facts in many ways. We discount evidence by citing a flaw in the presentation—the presenter stutters or wears a funny hat. We artfully skip over applicable knowledge by focusing on the silly cap. We do this because conflicting data drains mental energy. To accept new information, we must reconcile conflicting beliefs and humbly accept error.
By focusing on a flaw, we reject the entirety of a worthy message without straining to address troublesome conflicts with our dearly held positions. We naturally dismiss opposing ideas and quickly welcome supporting evidence (even when that evidence is simple speculation).
Politics and Narrow-Minded Thinking
Politics provides constant examples of heuristics in action; we overlook or accept political arguments based on our preconceived notions. If you are an ardent supporter of the person serving as president, it is unlikely you will question any of her (or his) antics as unethical or maddening. Conversely, if you dislike him (or her), it is difficult to accept any of his (or her) actions as good for the country.
Stepping away from our personal opinions momentarily, we can examine the circus, watching supporters disdainfully point to concerns about the opposition while sweeping away egregious behaviors by their favored party. Even brutish bigotry can be softened with words and explained away as logical. We need to stop the dumbing down of politics, step back for a wider and more comprehensive view, and cry foul when foul behavior is displayed on either side of the aisle.
Biased Thought: Our thoughts are constructed from our experience of how the world relates to the facts we observe. Culture, religion, and family upbringing provide a foundation for understanding. We are all biased; it's how the mind works.
Old ways feel comfortable, requiring little thought, so we put on blinders and cozily continue, as we all do. We protect the security of the past at the expense of learning because we are conditioned to do so—biologically, we are inclined to act on past knowledge.
We need a foundation of knowledge to smoothly navigate the constant and heavy flow of information. Our survival depends on inferences. We couldn't function without a basis for understanding; the world is too complex. We must predict dangers and advantages early—and react accordingly, even when some of these reactions are foolish.
Stop, Pause, Think
We repeatedly face a choice between security and knowledge. Speedy interpretations occur spontaneously, initiating a behavioral reaction; but we are not doomed to follow these inclinations. Once cognition catches up, we can challenge our narrow-minded direction, employ new strategies, and purposely investigate other possibilities.
Our past framework for understanding isn't magical—it's subject to error. Emotional learnings from the past may have been appropriate then but are now misguided, limiting progression and forcing a cycle of tired, worn-out routines. Through mindfulness, we can expose some of our limiting biases in favor of more complex understandings. Open exploration is the nexus of growth and wisdom.
When you feel an urge to reject discomforting information, step back (if it is safe), look a little deeper, ask new questions, and consider new endings. You'll find many great hidden gems of wisdom beyond the confining boundaries of narrow-minded and biased thought.
Harrison, G. (2013). Think: Why You Should Question Everything. Prometheus Books.
Hastie, R., & Dawes, R. (2009). Rational Choice in an Uncertain World: The Psychology of Judgment and Decision Making (2nd ed.). SAGE Publications.
Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131.