Our minds really do play tricks on us, thanks to faulty thinking patterns called cognitive biases. Propaganda and fallacies often exploit these biases—and not even scientists are immune. Let’s see some examples of cognitive biases and how to outsmart them.
You’ve probably noticed that your brain is an astounding device. Even now, it’s letting you translate the words on this page into a meaningful message you can assess and consider—while simultaneously registering a host of other inputs and regulating the functions that are keeping you alive. All this choreographed complexity reflects the glory of the human brain’s Creator. But in our fallen world, even the finest instruments can be prone to error now and then.
Errors in human reasoning include cognitive biases—faulty thinking patterns that help explain why false messages can sound true and may even mislead large numbers of people. As this blog post explains, cognitive biases tend to arise from the ways our brains use heuristics, or mental shortcuts, to quickly process information for making snap decisions.
For instance, we use the follow-the-majority heuristic to make snap decisions by copying other people’s behavior, based on the assumption that the majority’s decisions are probably the wisest. This assumption can work in many cases—but clearly, not in every case. So, relying too heavily on the follow-the-majority heuristic leads to a cognitive bias called the bandwagon effect.
Advertisers may wield this effect by suggesting, “Everybody else is buying our product, so you should too!” However, arguing that a message, belief, or decision is correct because “everyone thinks so” is a fallacy—a faulty form of logic—called appeal to the people.
This example illustrates how heuristics and cognitive biases can lead us to fall for fallacies. On the flip side, research has shown that students who avoid cognitive biases tend to be better critical thinkers.1 Researchers have identified scores of cognitive biases, but let’s look at some of the most common examples to watch out for:
Have you noticed that it’s easier to remember the content of a message you heard a while ago than to recall that message’s source? For example, I’ll catch myself telling people, “I remember reading somewhere that . . . !” Although I can still clearly remember the message, I no longer have a clue where I read it.
This quirk of the human mind is called the sleeper effect. The sleeper effect becomes a problem if we originally discount a message that we heard from an unreliable source, but over time, we forget the source—and its unreliability—and begin believing the message.
Even after learning that a message is false, we may still find ourselves basing our thinking on that faulty belief. This tendency is so common that psychologists have a name for it: the continued influence effect.
For example, I remember counseling kids at a Christian summer camp, where I absently remarked that dinosaurs lived long before mammals. I knew that was an unbiblical, evolutionary idea.2 But I still repeated it automatically—during a camp Bible study! Thankfully, my co-counselor corrected me. But the incident showed me how the evolutionary story I’d heard so often had a continued influence on my thinking, even though I knew it wasn’t true.
A similar cognitive bias, the Semmelweis reflex, describes humans’ tendency to reject new information that opposes an established belief. This effect helps explain why false—but widely accepted—beliefs can take so long to correct in broader society, even as new information comes to light.
For example, so much observational data conflicts with the idea that all life evolved from one ancestor that even prominent mainstream scientists are speaking out.3 Yet outdated evolutionary ideas remain fixed in textbooks, the media, and culture. The Semmelweis reflex may be one of many possible factors (including spiritual ones) behind these ideas’ persistence.
Another important cognitive bias to watch out for is the illusory truth effect, where a message sounds true because it’s repeated often. Repeated messages are easier to process, making the brain mistake familiarity for truth.4 Studies have found that not only do people rate repeated messages as more likely to be true, but repetition also increases people’s likelihood of calling a false statement true—even if they originally knew it was false.5
This effect helps explain why, when students hear evolutionary origins stories repeated in their classes, they may find it increasingly hard to resist believing those stories, despite knowing they aren’t true.
Speaking of the power of repetition, have you ever found that the more familiar a person, place, or thing becomes, the more you begin to like it? This is the mere exposure effect—humans’ tendency to prefer the things they encounter most frequently.
Politicians, for instance, often harness the mere exposure effect by posting pictures of themselves around the city before an election. The more times people see the politician’s face and name, the more people may automatically begin to like and trust that candidate. This illustrates how the mere exposure effect can make for effective propaganda—a type of communication that persuades by relying on factors besides logic.6
Another cognitive bias that lends itself to propaganda is the framing effect. Framing effects happen when the same information, presented in different ways, leads us to different conclusions. Take, for instance, two hypothetical headlines reporting the same statistic: “New Treatment Helps 90% of Patients” versus “New Treatment Fails 10% of Patients.”
The first headline is gain-framed, meaning it’s presented with a “glass-half-full” spin. But the second is loss-framed, underscoring the negative side of things. Both messages are logically equivalent, but they sound completely different. To thwart the framing effect, simply ask yourself what the other side of a statistic is showing.
Along with framing effects, another cognitive bias that advertisers love exploiting is the halo effect. This bias describes the human tendency to assume that a person who excels in one area (like physical appearance) also excels in other areas (like kindness, humor, or intelligence).
If we stopped and thought about it, we’d realize this assumption isn’t necessarily true. Yet the brighter someone’s “halo effect” glows, the more convincing we may find their messages—even though it’s a genetic fallacy to believe a message based only on the type of person communicating it.
Another kind of genetic fallacy, appeal to authority, occurs when we believe a message is true simply because an expert is communicating it. Experts are typically the most reliable human information sources. But even experts can make mistakes, believe wrong information, and be biased by their worldviews. So, making snap judgments based only on expert opinion—that is, using the authority heuristic—isn’t necessarily logical. Yet for better or worse, humans rely on the authority heuristic so often that the tendency has earned itself a nickname: the authority bias.
And as it turns out, even experts, being human, are subject to cognitive biases. One example is the observer-expectancy effect, when scientists inadvertently manipulate or misinterpret their research to confirm an expected result.
A related bias, selective perception, describes how what we expect to see can ultimately influence what we think we really do see. One classic study, for instance, asked students to watch the same college football game and report how many rule infractions they saw the teams committing.7 Students tended to underreport their favorite team’s errors or overreport the opposing team’s errors, illustrating how people’s prior stances can affect their perceptions.8
The last cognitive bias we’ll look at for now is called the belief bias, which describes how humans tend to rate an argument’s strength based on how believable its conclusion sounds rather than how logical the argument is.
For example, the argument, “All cats are animals. Some animals are mammals. Therefore, all cats are mammals,” might sound right because its conclusion is true. However, this argument isn’t logical—swap “mammals” for “reptiles,” and the very same structure yields a false conclusion.9 But research shows people fall for the belief bias less often when they have more time to think about the argument rather than making snap decisions about whether its conclusion “makes sense.”10
All these cognitive biases represent just a few of the many ways our minds can “play tricks” on us. In our fallen world, even our fearfully and wonderfully made brains can fall into less-than-logical thinking patterns here and there. But oftentimes, all it takes to override these faulty thinking patterns is just a little biblical critical thinking.