Sometimes, false messages can sound true because they’re delivered by authority figures. A series of famous experiments in the 1960s revealed the persuasive power of authority, showing how far people will go against their convictions at an authority figure’s request, and reminding us that our highest authority is God.
Ever notice how even false messages can sound true when they come from an authoritative source?
The power of authority has long been a staple of propaganda techniques, which try to promote messages based on factors other than logic. Even pioneer propagandist Edward Bernays, who famously applied war-time propaganda techniques to the marketing and advertising world, harnessed authority figures’ sway. As this article about the history of propaganda explains, Bernays managed to popularize bacon and eggs in 20th century America by recruiting doctors to promote the “health benefits” of eating bacon for breakfast. Today, you can spot the same tactic in advertisements starring people with lab coats, clipboards, or intelligent-sounding manners of speech.1
As I’ve mentioned in earlier posts, such advertisements persuade by tapping into a decision-making shortcut known as the authority heuristic—the assumption that experts usually know what they’re talking about, so it’s reasonable to believe what they say. This mental “rule of thumb” comes in handy much of the time, but even experts are fallible. So, the authority heuristic spawns the authority bias, a faulty thinking pattern describing our tendency to be influenced by authority figures regardless of their messages’ content. However, believing a message only because an expert shared it is a logical error called the appeal to authority fallacy.
To glimpse how far the tidal force of authority can sweep people, let’s backtrack to the 1960s and step into the lab of the famous—and highly controversial—researcher Stanley Milgram. His classic study2 went like this:
Imagine you see an ad saying that a researcher will pay you to take part in “a scientific study of memory and learning.” You show up at the lab, where you meet two individuals: a well-dressed experimenter and an unassuming, grandfatherly civilian. (Spoiler alert: while you think that Mr. Unassuming is a research subject just like you, he’s really an actor working for Milgram.)
The experimenter passes you and Mr. Unassuming an upside-down hat with two slips of paper inside, asking you each to select a slip to determine your role in the study. Drawing one, you read the word teacher written on it. Unbeknownst to you, both slips say teacher, so that would have been your role either way. Mr. Unassuming, meanwhile, adopts the role of “learner.”
The experimenter leads you both into a room, where you watch as Mr. Unassuming is strapped into an electric shock generator. You’re then taken to a separate area where you can hear—but not see—the “learner.” There, the experimenter sits you in front of the control panel for the shock device, hands you a list of memory questions to ask Mr. Unassuming, and orders you to deliver increasingly strong shocks every time he answers incorrectly. (In reality, the shock device doesn’t generate electricity. But you don’t realize that.)
Mr. Unassuming protests the shocks. But if you say that you want to stop the study, the experimenter begins giving you four spoken prods, in this order: “Please continue”; “The experiment requires that you continue”; “It is absolutely essential that you continue”; and “You have no other choice, you must go on.”
If you’re like most people, you’d probably feel uncomfortable being commanded to hurt another person. But according to Milgram, nearly two-thirds of participants in this study not only went ahead with it, but also proceeded all the way to delivering the final “450-volt” shock at the experimenter’s demand.
This study understandably raised major ethical concerns, and various researchers have questioned other aspects of Milgram’s methods and interpretations as well. For instance, some participants likely realized the experiment was a hoax, and later review suggested that Milgram’s experimenter actually gave far more verbal commands than the four spoken prods alone.3 Even so, replications of these experiments over time have shown similar results, revealing that well over half of participants will apparently act against their consciences because an authority figure says to.4,5,6
Now, let’s think about these findings in the context of today’s culture, where authority figures (like university professors) may require Christians to behave or believe in ways which go against God’s Word. For example, while my professors and textbook authors never asked me to shock anyone, they did expect me to believe ideas which I knew weren’t right based on Scripture, like human evolution.7 The fact that these messages came from authority figures made them seem true. But a message’s source does not logically affect its truth—unless, of course, that source is 100% infallible.
God alone matches that description. He is the ultimate authority (Colossians 1:15–18), and we can trust his Word completely. Proverbs 30:5 states that every word of God is true; both Numbers 23:19 and Titus 1:2 assure us that God does not lie; and Romans 3:4 (ESV) declares, “let God be true, though every one were a liar.” Correspondingly, Acts 5:29 records that when authoritative human mandates conflicted with God’s Word, the apostles resolved, “We must obey God rather than men.”
In the end, while the power of authority helps us understand why unbiblical messages can sound persuasive, the truth reminds us that our highest authority is our Creator.