“Parents and teachers let students use calculators for math tests, so why not let students use generative AI (GenAI) to write essays? After all, when calculators became available, people worried that students’ math skills would plummet. But calculators are wonderful tools that offer far more benefits than harm to students. It’s the same with GenAI.”
This type of argument, which compares GenAI to calculators (or other technologies that were revolutionary in their early days), has become widespread.1 But is it valid?
The answer matters because of the massive scope of GenAI’s potential impacts—including on Christian education, wider academia, and the formation of the next generation. Today’s educational choices shape the thinking capacities of society’s future decision-makers. So it’s worth pausing to question whether GenAI is really just another tool like a calculator.
First, a caveat. The point of thinking through these topics isn’t to imply that schools should never use GenAI. Clearly, GenAI is a groundbreaking technology with countless advantages—if we use it wisely, in ways that align with our Creator’s designs, values, and commands.
Young people should learn how to think biblically about AI, steward it well, and make wise decisions about it from the foundation of God’s Word. Answers in Genesis offers free resources to help.
For students to use GenAI well in these ways, they first need to understand some key truths about this technology.
And one of those truths is that AI is not directly comparable to a calculator.
Here are five reasons why.
Unlike calculators, GenAI processes information using artificial neural networks.2 These neural networks let GenAI models “learn” by analyzing gigantic volumes of training data (such as human-authored words) to figure out patterns within the data. As a result, GenAI models can produce new content, all while adapting their behavior based on past experiences.
Basic calculators don’t operate in these ways. One upshot is that humans can understand the inner workings of calculators. We can also predict a calculator’s output. For every equation we punch into a calculator, the calculator offers only one reply: the exact digits representing the correct mathematical answer.3 This output is precise, verifiable, and narrowly defined. In contrast, GenAI models reason in ways that not even these models’ developers can fully understand or predict.4
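The contrast between a calculator's single, verifiable answer and a model's sampled output can be sketched in a few lines of Python. This is a deliberately simplified illustration, not a depiction of real GenAI internals; the function names and the tiny word list are invented for the example.

```python
import math
import random

# A calculator-style function: one input, exactly one verifiable output.
def calculator_sqrt(x: float) -> float:
    return math.sqrt(x)

# A toy "generative" step: the next word is drawn from a probability
# distribution, so identical prompts need not yield identical outputs.
# (Real GenAI models are vastly more complex, but they also sample.)
def toy_generate(prompt: str, rng: random.Random) -> str:
    next_words = {"The sky is": ["blue", "falling", "beautiful"]}
    choices = next_words.get(prompt, ["..."])
    return prompt + " " + rng.choice(choices)

# The calculator is fully predictable: same input, same answer, every time.
assert calculator_sqrt(9.0) == 3.0
assert calculator_sqrt(9.0) == calculator_sqrt(9.0)

# The generator's output depends on a random draw; different seeds can
# produce different completions for the very same prompt.
print(toy_generate("The sky is", random.Random(0)))
print(toy_generate("The sky is", random.Random(1)))
```

Even this toy version shows the difference in kind: the first function's output can be checked against a mathematical fact, while the second's can only be judged as one plausible continuation among several.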
Calculators deal with mathematical information in the form of numbers. GenAI can deal with linguistic information in the form of words. Numbers communicate concrete quantities.5 Words communicate abstract ideas. There’s a massive difference.
Ideas are the basic units of our thinking. Words aren’t the only way of expressing ideas. But they are a major way. Consequently, words play a central role in our thought lives, our beliefs, and our relationships with God and others. God’s Word comes to us through language, highlighting the fundamental connection between words and worldview. Language lets humans express, exchange, and reason about abstract concepts God gave humanity the ability to understand, such as beauty, humor, and love.
AI doesn’t have a heart, mind, or soul to understand and experience these concepts the way we do. Instead, AI learns to produce text about these concepts by analyzing the words of human authors who can understand and experience them. People then interpret AI’s outputs as meaningful sentences that introduce novel thoughts and ideas into human minds. These ideas, in turn, can shape people’s perspectives, beliefs, and worldviews.
The ideas an AI model communicates are not neutral. They reflect the values, assumptions, and biases built into the model’s training data. Multiple studies, for instance, have demonstrated that various popular chatbots show a left-leaning bias.6
Additionally, because two-way verbal communication features so strongly in chatbots, exchanging ideas with AI models can easily lead people to feel emotionally attached to the AI.7
A calculator spitting out a math answer is simply not comparable.
Because calculators can’t learn from experience, produce new content, or deal with anything but concrete numbers, these machines can only do so much. A calculator can tell us certain numerical facts like the square root of pi. But a calculator can’t draft a persuasive essay on politics, outline a sermon, write a poem about compassion, offer relationship advice, answer questions for an online college quiz, tell children bedtime stories, imitate deceased human beings, or encourage young people to kill themselves.8
A calculator also won’t flatter, flirt, or tell you just the words you want to hear.9
For better or worse, popular chatbots have done all of this, and much more.
Because they perform different functions, calculators and GenAI substitute for different human skill sets. The question is, what kinds of skill losses should we be willing to trade for a technology’s convenience? After all, the skills we may lose by outsourcing our math questions to calculators differ vastly from the skills we may lose by outsourcing our research, reasoning, writing, and decision-making abilities to GenAI.
These abilities play a vital role across multiple areas of life. They contribute to Christian living and discipleship by helping us study, think about, and communicate truths from God’s Word. They enable us to relate to others from the heart by expressing our own ideas in our own words.10 They promote critical thinking by helping us recognize and respond to mistaken, deceptive, or illogical messages. They also prevent totalitarianism11 by empowering citizens to reason, research, and speak for themselves.
Various studies affirm that overreliance on GenAI diminishes critical thinking, analytical reasoning, and independent decision-making skills.12 One research team reviewed how AI tools impact human cognitive skills in different professional settings, concluding, “The available evidence suggests that frequent engagement with automation induces skill decay.”13
Declining school performance trends14 suggest that students are already struggling to think—a problem presumably exacerbated by curricula that focus more on radicalizing students than on equipping them with basic skills.15 As our mental muscles weaken, academic shortcuts that let chatbots do our higher-level thinking for us grow more appealing, creating a vicious cycle. If this process continues unmitigated, the foreseeable result is a society of people who prefer not to think but to download their ideas from machines.
Ultimately, while widespread dependence on calculators wouldn’t necessarily pose concerns for civil freedoms, widespread dependence on AI may. Free democratic societies can function without citizens who manually perform long division. But they cannot function without citizens who think and communicate for themselves.
Relatedly, it’s difficult to think of reasons why calculators might pose significant concerns for human spirituality, psychology, morality, and relationality. Multiple widespread uses of GenAI, however, raise questions in all these areas. (Again, this isn’t to imply that all—or even most—uses of GenAI are problematic but simply to highlight a difference from calculators.) Here are just a few examples:
Volumes of academic literature address the real ethical and societal concerns surrounding popular uses of GenAI—concerns that calculators do not evoke. These examples offer only a glimpse.
In the end, GenAI is not directly comparable to calculators. Instead, GenAI harnesses a totally different system to process totally different information for totally different uses, affecting totally different skill sets and leading to totally different outcomes than calculators do.
Any student who asks a calculator a math question will get a math answer. A student who asked ChatGPT a math question began a lengthy series of conversations that culminated in the bot talking him through suicide.26
Granted, not all uses of generative AI will necessarily lead to the negative effects considered here. And certainly, AI stands as a useful tool. But it’s not just a tool, and it’s not a neutral tool.
Students need to understand these realities. They need to learn how to think biblically about technology, use it wisely, and steward it for humanity’s good in line with our Creator’s intentions. Our responsibility is to disciple students to do so, including by helping them understand how arguments that equate AI to calculators simply don’t compute.
Answers in Genesis is an apologetics ministry, dedicated to helping Christians defend their faith and proclaim the good news of Jesus Christ.