This piece was adapted from Russell Moore’s newsletter.
In the past several weeks, two events occurred that are going to change our futures. One of them was the launch of OpenAI’s new artificial intelligence program, GPT-4o, just ahead of several competitors who will do the same in a matter of weeks. The other was the defrocking of a robot priest for teaching that baptisms could be done with Gatorade. I’m afraid the church is not ready for either.
The more talked-about happening was the OpenAI announcement, complete with videos of the AI program laughing, seeming to blush, telling jokes, seeing and describing things in real time, and even singing songs made up on the spot (to whatever degree of emotion and enthusiasm was demanded).
Far less culturally noticed was the fact that just a few weeks before, the Roman Catholic apologetics platform Catholic Answers reined in an AI chatbot called “Father Justin,” which was designed to help people through questions of doctrine and practice.
People started to get upset when Father Justin started claiming to be an actual priest, capable of hearing confession and offering sacraments, and when it started giving unorthodox answers to questions, such as whether baptizing a baby with Gatorade would be all right in an emergency (the magisterium says no).
Now Father Justin is just “Justin,” a “lay theologian.” Catholic Answers acknowledged to critics that they are pioneering a new technological landscape and learning—as the whole world will—just how difficult it is to keep an artificial intelligence orthodox. If my Catholic friends thought Martin Luther was bad, wait until the robots start posting theses to the cloud.
Before one laughs at Catholic Answers, though, one should think about the now-quoted-to-the-point-of-cliché anecdote of 19th-century preacher D. L. Moody’s response to a critic of his evangelistic practices: “But I like my way of doing it better than your way of not doing it.” Behind the scenes, almost every forward-thinking ministry of any kind is worried about how to be ready for an AI-transformed world, imagining what it would have been like if Luther had not been ready for a Gutenberg era or if Billy Graham had not been ready for a television age.
One AI expert told me recently that he and others are realizing that people will say to an AI what they would never admit to a human being. Doctors know, for example, that when they ask a patient, “How much do you drink each week?” a potential problem-drinker gives one answer, while a chatbot asking the same question gets something much closer to the truth.
The same is true when it comes to spiritual searching, this expert said. The person who would never ask a Christian person, “What will happen to me when I die?” or “Why do I feel so guilty and ashamed?” is far more likely to ask such questions to an intelligence that’s not another person. In some ways, that sounds oddly close to Nicodemus, who came to ask questions of Jesus at night (John 3:1–2).
“The question is not whether people will be searching out chatbots for big questions like that,” the expert told me. “The question will be whether the only answers they get are spiritually wrong.”
The real challenge may prove to be not whether the church can move fast enough to see an artificial intelligence world as a mission field, but whether it will be ready for the conflicted emotions so many of us noticed in our own responses to the OpenAI announcement videos themselves.
The videos provoked in many people an almost moon-landing level of wonder. As I said to my wife, “Watch this. Can you believe how it tutors this kid on a geometry problem?” I realized that, one day, my reaction would feel as “bless your heart” naive as the old videos of television anchors debating how to pronounce the “@” symbol in the then-new technology called email.
At the same time, though, the videos kind of creeped a lot of us out. That vague feeling of unease is what psychologists call the “uncanny valley.” It’s the reason lots of people would be terrified to be trapped inside a doll-head factory or in a storage shed filled with mannequins. Human beings tend to respond with dread to something that’s close enough to seem lifelike but doesn’t quite get there. Something our brain wants to read as both “human” and “non-human,” or as both “alive” and “dead,” tends to throw our limbic systems off-kilter.
Print and radio and television and digital media have their effects on the communication of the gospel, as Marshall McLuhan and Neil Postman warned us. But what those media retained in common with oral proclamation was a connection, however tenuous, to the personal. One might not know who wrote a gospel tract one finds in the street, but one does know there’s a human being somewhere out there on the other side of it.
On the one hand, I am almost persuaded by the argument that one could put AI in the same category as the quill Paul used to pen his epistles or the sources Luke compiled to write his gospel. AI programs are designed by human beings, and the Word of God comes with power regardless of the format.
Even so, that doesn’t seem to be the whole story. Do people experience the “uncanny valley” unease here just because it’s a new technology to which we’re not yet accustomed? Maybe. Or maybe there’s more to it.
A few weeks ago, the Sketchy Sermons Instagram account featured a cartoon rendering of a quote from the comedian Jaron Myers: “I’ve seen too many youth pastors be like ‘Be careful on TikTok, it’s just girls dancing in swimsuits’ and I’m like bro … It’s an algorithm.”
The joke works because we live now in an ecosystem where everything seems hyper-personalized. The algorithms seem to know where a person’s heart is better than that person’s pastor or that person’s spouse or even that person’s own heart. If you like knitting content, you see knitting content. If you like baby sloth videos, you see baby sloth videos. And if you like bikini-dancing—or conspiracy theories or smoking pot—you get that content too.
That hyper-personalization is ironically the very reason this era seems so impersonal. Even if a machine seems to know you, you can’t help but realize that what it knows is how to market to you.
The gospel, though, cannot be experienced as anything but personal. If the Word of God is breathed out by the very Spirit of Christ (1 Pet. 1:11), then when we hear it, we hear not just “content” or “information” or disconnected data curated by our curiosities and appetites. We hear him.
How does one convey that in a world where people wonder whether what they are hearing is just the inputs from their own digital lives, collected and then pitched back to them?
That so many are queasy when they see a friendly, helpful, seemingly omniscient AI might tell us something about ourselves. Despite the caricature, philosopher Leon Kass never said that “the wisdom of repugnance” is an argument for or against anything. What he wrote was that when we feel some sort of revulsion, we should ask why. Sometimes it’s just cultural conditioning or the fear of the unknown—but sometimes it’s “the emotional expression of deep wisdom, beyond reason’s power fully to articulate.”
Should we conclude that God is able from these chatbots to raise up children for Abraham? How do we make sure that, when people are thirsting for living water, we do not give them Gatorade?
What I do know is that no new technology can overcome one of the oldest technologies of them all: that of a shepherd leading a flock with his voice. Yea, though we walk through the uncanny valley of the shadow of data, we should fear no evil. At the same time, we have to be ready for a very different future, and I’m not sure we are.
Russell Moore is the editor in chief at Christianity Today and leads its Public Theology Project.