Why Bad Logic Can Often Sound Persuasive

by Patricia Engler on August 19, 2020

Even when we know they’re not true, arguments that rely on propaganda and flawed logic can still sound incredibly persuasive. Let’s explore the science of persuasion to see why this happens—and how to overcome it.

All eyes in the classroom watched as the professor paced the front of the room, an aura of knowledge seeming to stream behind him like the tail of a comet. With flawless eloquence, he wove accounts of ape-like humans walking upright, developing language, and inventing religions.1 Did everyone else in the class agree? They seemed to. And I couldn’t blame them. The professor made everything sound so . . . true.

Of course, no message is true simply because a professor believes it, because it’s eloquently worded, or because everyone else seems to agree. Reasoning this way means appealing to different types of faulty logic called fallacies of irrelevant premises. Given the truth of God’s Word, affirmed through internal consistency and external reality, we can be confident that messages which oppose Scripture are not completely true. And yet, as I experienced firsthand as a student, they can still sound persuasive.

Why do faulty lines of logic sometimes sound persuasive, even if we know they’re false? To find out, let’s dive into the psychology of how persuasion works in the brain.

The Science of Persuasion

Many theories of persuasion look at two channels the brain uses to make decisions.2 Researchers sometimes use different names for these channels, but essentially, one channel involves careful, logical reasoning, while the other involves automatic, intuitive thinking. The automatic channel operates by making snap decisions based on cues that don’t usually have anything to do with logic, like how something looks or how it makes us feel.

Propaganda, which tries to persuade by appealing to something besides logic, exploits this automatic channel. To do so, propaganda taps into decision-making shortcuts the brain often uses, called heuristics (rhymes with “few mystics”).

Heuristics work like mental rules of thumb to help us save time making everyday decisions. For instance, we use the follow-the-majority heuristic when making decisions based on what other people are doing, following the rule of thumb that the majority’s decision is probably best. This heuristic may work in many cases (think one-way streets or dinners with multiple forks); yet as history shows us, public consensus is not always correct. So, as useful as heuristics are, they can also mislead.

Thanks to their fallibility, heuristics may give rise to faulty thinking patterns known as cognitive biases. The follow-the-majority heuristic, for instance, corresponds to a cognitive bias called the bandwagon effect, describing humans’ tendency to follow the crowd (even if the crowd is going the wrong direction).

How Heuristics Spawn Fallacies

Arguments based on cognitive biases and heuristics often involve fallacies of irrelevant premises. Much like propaganda, these fallacies try to persuade by appealing to anything besides truth. The follow-the-majority heuristic, for example, leads to the fallacy called Appeal to the People, which claims a message is true because most people believe it. Let’s look at a few other common persuasion mechanisms, tracking the brain’s thinking from heuristic to fallacy:

  • Authority

    The authority heuristic states that experts usually know what they’re talking about, so it’s reasonable to believe what they say. Certainly, this shortcut comes in handy much of the time. But even experts can believe wrong information, hold cognitive biases, and be influenced by their worldview starting points. So, this heuristic leads to a cognitive bias dubbed the authority bias, describing humans’ tendency to be influenced by the opinions of authority figures—like professors—regardless of what they’re saying. However, reasoning that a message is true only because an authority figure said so is the Appeal to Authority fallacy.

  • Emotion

    The next heuristic is a bit of an emotional subject. It’s called the affect heuristic, describing our tendency to make snap decisions based on how something makes us feel. This shortcut gives rise to several cognitive biases including the halo effect, which refers to the brain’s knack for assuming that someone who excels in one area, like physical appearance, also excels in other areas, like intelligence or morality. So, messages can sound more reliable simply because they come from nice-looking people. You don’t need to watch many commercials to see how often advertisers exploit this effect, even though believing a message based only on who said it is a type of genetic fallacy.

  • Repetition

    Another mental shortcut called the fluency heuristic refers to our tendency to make snap decisions based on how quickly our brain can process the relevant information. Messages which are repeated often or expressed eloquently, like the evolutionary stories I heard in class, are often simpler for the brain to process. This helps the messages seem to “make sense.” Because messages which make sense are often true, the brain begins to associate processing ease with truth.

    The fluency heuristic can cause false messages to sound true because they’re repeated often, a cognitive bias known as the illusory truth effect. However, believing a message because it’s frequently repeated is a fallacy called ad nauseam, or Appeal to Repetition.

    Repeated messages (say, the message that humans evolved from apelike ancestors) sound especially true when they come from different sources3—like museums, books, movies, television, and professors. And notably, research shows that university-age students may be especially susceptible to the power of repetition compared to older adults.4 That’s just one example of why Christian youth need connections with older adult mentors—not despite the fact that older adults may think differently than students, but precisely because of it.

Thinking About Thinking

While heuristics and cognitive biases can lead to logical fallacies, research has shown that students who avoid cognitive biases tend to be better critical thinkers.5 In other words, being aware of these mental processes can help us resist falling for persuasive-sounding—but unbiblical—messages. When we understand how our minds work and fill them with the truth of God’s Word, we grow better equipped to counter the untruths of secular classrooms and culture.

For more on how to think critically about any faith-challenging message, stay tuned for future blog articles and my new video series, CT (Critical Thinking) Scan, available now on the AiG Canada YouTube channel and the AiG Canada Facebook page.

Footnotes

  1. See Critical Thinking Scan Episode 18, “Did Religious Beliefs Result from Evolution?”
  2. Bertram Gawronski and Laura A. Creighton, “Dual Process Theories,” in The Oxford Handbook of Social Cognition, ed. D. E. Carlston (Oxford, UK: Oxford University Press, 2013), 282–312.
  3. Christian Unkelbach, “Reversing the Truth Effect: Learning the Interpretation of Processing Fluency in Judgments of Truth,” Journal of Experimental Psychology: Learning, Memory, and Cognition 33, no. 1 (2007): 219.
  4. Nadia M. Brashier, Sharda Umanath, Roberto Cabeza, and Elizabeth J. Marsh, “Competing Cues: Older Adults Rely on Knowledge in the Face of Fluency,” Psychology and Aging 32, no. 4 (2017): 331.
  5. Richard F. West, Maggie E. Toplak, and Keith E. Stanovich, “Heuristics and Biases as Measures of Critical Thinking: Associations with Cognitive Ability and Thinking Dispositions,” Journal of Educational Psychology 100, no. 4 (2008): 930.
