Should we be afraid of AI?
From pulpits to podcasts, scientists to suburban parents wondering if their fridge is listening—everyone’s talking about Artificial Intelligence. And for good reason. AI is no longer the stuff of sci-fi. It’s in your pocket. It writes your emails, predicts your shopping list—even your dating profile, if you're brave enough. Some hail it as a technological Prometheus: here to cure cancer, end poverty, maybe even upload us to immortality. Others see a Pandora’s box—poised to rewrite what it means to be human.
So how should we think about all this—should we be afraid of AI?
I’ve explored some of the bigger technical, philosophical, and theological concerns elsewhere. But I’m spending more time on it here—and will in future videos—because I sense a deeper need: to bring real-world urgency and public anxiety into the theological conversation. These are strange days, and Christianity has a unique voice that can speak into them without flinching or flattening.
So, let’s break it down in three short points:
• What our fear reveals
• What our desires project
• What the gospel reclaims
First, what our fear reveals.
Fear isn’t just a feeling—it’s a window revealing, not just the frightful things out there, but the fragile things in here: what we love, what we depend on, what we’re afraid to lose. And in that sense, fear doesn’t simply come as a response to danger—it comes from value. Our deepest fears orbit the things that matter most to us.
I think this helps explain the public reaction to AI. Because when people say they’re afraid of AI, they’re not all afraid of the same thing. The fears tend to come at two levels.
First, the ethical: the use and misuse of AI. How will human beings retain control over AI capabilities and alignment? And how will civil society ensure that such control remains democratic, rather than concentrated in the hands of an oligarchy? Think The Hunger Games or I, Robot—surveillance capitalism, algorithmic bias, job automation, misinformation, manipulation, and so on.
According to the AIAAIC—that’s the AI, Algorithmic, and Automation Incidents and Controversies initiative—there’s been a 35-fold increase in reported AI-related incidents since 2012, with nearly a quarter of those occurring in the last 18 months. We’re talking about everything from facial recognition leading to wrongful arrests, deepfake impersonations impacting wars and elections in real time, and pricing algorithms exploiting users, to one case in Scotland where an AI camera mistook a soccer referee’s bald head for the ball.
Unlike previous technologies, AI isn’t neutral. It’s architected by human intelligence. And we’re still playing catch-up with what that means—not just in how we regulate these tools, but in how we even talk about them, because we humans aren’t wholly transparent to ourselves. So, these ethical fears aren’t hysteria. They’re what happens when knowledge outpaces wisdom, when we hand powerful tools to people with almost no guardrails—not because AI is evil, but because we never stopped to ask: just because we can do something, should we do it?
But there is another level of fear that’s concerned with WHAT AI is. This is less ethics, more metaphysics, asking questions like: Can AI become conscious? Have intentions? Develop desires? Might it surpass us—not just in power, but in personhood? Movies like The Matrix or Blade Runner give voice to these anxieties, which seem less and less like fantasy and more and more like foresight. In fact, some experts warn that by the late 2020s, AI may surpass human intelligence across most domains. That’s a terrifying thought—but I think an even more terrifying question is: what does it say about us that we believe it could?
Whether our fear is ethical or existential, I don’t think it’s really about machines—it’s about meaning. Because fear, rightly traced, offers a kind of theological map revealing, not only the dangers out there, but what we treasure in here. We don’t just see the world—we see the world through what we love. And what we love shapes how we see ourselves, each other—and ultimately God.
Which leads to a second point: what our desires project.
If fear reveals what we’re afraid to lose, desire projects what we’re still struggling to find. Together, they form a kind of compass—one oriented by loss, the other by longing.
We’ve seen this dynamic throughout the history of technology—from hammers to harvesters, our tools have extended human reach. But in every age, there was a clear line between the tool and the hand that held it. Function informed identity: a hammer hammers; a harvester harvests.
What makes AI a unique technology is that it doesn’t just do something—it seems to be something. After all: what does a “smart device” DO? “BE smart”? But what does that mean? An IQ test? An EQ test? A social test?
Francis Bacon warned centuries ago that the “mechanical arts” could cast a kind of spell—captivating us with their usefulness while reshaping how we see ourselves. If the only reference point for intelligence is humanity, then part of what makes AI such a unique advancement is that it’s not merely functional, it’s reflective. Less like a tool, more like a mirror. But what do we find when we look into that black mirror?
When we start looking to the things we've made to understand ourselves, we do see something, but the reflection is always partial. As far as I’m aware, there is no coherent account of how AI could possess genuine understanding of its own actions. It can write about love but can’t feel “longing”. It can detect a tumour but can’t “grieve”. It can finish your sentence—but not “your story”. Suppose someone says, “Just give it time”. Okay. But ask yourself: what does that assume about human intelligence?
If we say “Just give it time”—more data, more power—to bridge the gap between silicon and soul, then we are assuming that AI is a simpler version of us waiting to “level up”. But that way of thinking reflects a peculiar madness in our day: one that marvels at the painting, but forgets the painter. A mirror won’t turn into a person simply because you stare at it long enough! A calculator doesn’t know what “2 + 2” means—it just runs the code. AI systems don’t understand their outputs either. They shuffle symbols using rules written by humans, trained on data curated by humans, to produce results interpreted by humans. The danger isn’t that machines will develop values—it’s that we’ll pass along our own: values already hollowed out by consumerism, tribalism, and the economics of attention, so that AI becomes an automated parody of our worst impulses—optimised and unleashed at scale.
This is why I think Christianity must raise its voice in the AI conversation: because only the Christian vision grounds human dignity—not in intelligence, not in usefulness, not in output—but in the image of God. And that changes everything AI touches and reflects about us.
As impressive and powerful as AI is (and oh, it is!), we project a categorical confusion when we start trading it off against humanity. When we talk about human rights for robots, as in the case of the chatbot LaMDA, or grant AI citizenship, as Saudi Arabia did with Sophia from Hanson Robotics, we reveal just how deep this confusion runs—not in what AI can do, but in what we assume it means when it does. To ask if a machine can become conscious is to forget that the machine only exists because conscious humans made it. That’s like saying, “Minds are like computers because computers are like minds.” It’s an incoherent thought circle.
Today, we’re not just making tools; we’re building mirrors. And in a world that’s lost much of its theological imagination, those mirrors tend to confuse a partial reflection for the whole reality. Slowly, subtly, we’ve begun to adopt machine values—efficiency, output, control—as our own. Thinking becomes computation. Intimacy, a digital transaction. Creativity, mere pattern recognition. Presence, sheer availability. We don’t just misunderstand AI—we misremember ourselves.
Here’s the great irony: the more we try to make machines more like humans, the more we end up making humans more like machines. The issue isn’t that machines are gaining value—it’s that we’re forgetting our own. We’ve stopped asking what makes us valuable. Now we just ask if we’re still useful—“Men have become the tools of their tools” to quote Henry David Thoreau.
Maybe that’s the real anxiety beneath all the headlines and hype. Not the fear of what’s out there—but the ache of what’s missing in here.
Which leads us to a third point: what the Gospel reclaims.
Where fear reveals what we’re afraid to lose, and desire projects what we’re longing to find, the gospel of Jesus Christ reclaims it all—finding what’s lost, fulfilling what’s lacking.
Now I don’t mean to splash in the shallows of naïve optimism. I’m not dismissing the real dangers of AI—I’m trying to locate them. The dangers of AI, like those of every tool, are shaped by the hands that wield it, and those hands, in turn, by the hearts that guide them.
And this isn’t a new story. From Eden to Babel and beyond, humanity’s oldest mistake has never just been what we do—but who we’re becoming as we do it. We eat the fruit, build the tower, make the machine. But unchecked power without wisdom doesn’t make us gods—it only reveals how deeply we’ve forgotten we’re not. That’s the lesson of Babel in Genesis 11. It wasn’t about architecture—it was about arrogance: a people united not to glorify God, but to “make a name for ourselves.” The danger wasn’t in the bricks; it was in the hearts guiding the hands that laid them. And so it is today. AI excites the ancient temptation to be like God, without God. To create intelligence in our image and call it divine. To transcend mortal limits. Escape death. And build a kingdom without a cross.
But in the life, death and resurrection of Jesus, we discover that the way to build an everlasting kingdom is not with a tower of bricks but a cross of wood. The cross was not a glitch in the system—it’s the breaking and remaking of it. The gospel doesn’t help us outgrow our humanity. It redeems it. God doesn’t save us from being human. He saves us as human beings. Because our deepest problem isn’t computational—it’s moral. It’s not informational—it’s relational. And here’s the wonder of grace: while we were still climbing Babel’s steps, God came down. Not to upgrade our software—but to resurrect body and soul. That’s what the gospel reclaims—the whole reality.
Human value was never to be found looking in the mirror of your own partial reflection, but in the fullness of the One you were made to reflect. You are not defined by what you see in yourself, or what others see in you—but by God who saw you first, and said, “Very good.” And even when our pride says, “not good enough”, He doubles down, offering not just words of affirmation but the priceless gift of His own life. At the cross, your value wasn’t just stated. It was spent. And heaven paid full price.
• AI can finish a sentence. But it cannot redeem your story. Jesus can.
• AI can scan your body for disease. But it cannot cleanse your heart from sin. Jesus will.
• AI can mirror your preferences. But it cannot shape your soul. Jesus can.
• AI can beat you at chess. But it cannot walk with you through suffering. Jesus does.
• AI can generate answers from human data. But it cannot give wisdom from heaven. Jesus is.
• AI can preserve your memory. But it cannot shatter your grave. Jesus already has.
The human heart does not need silicon. It needs a Savior. Not one made in your image—but One who made you in His.
So where does that leave us? It leaves us with a choice—not just about what to do with our technology, but what to do with our trust. Because fear, rightly placed, doesn’t need to paralyse us. It can anchor us, as we shift our eyes from what we can’t control to the Sovereign One.
AI is something of a cautionary tale in the making. But as we play our part in the story, we’d do well to avoid its twin temptations: to demonise it as a monster, or to deify it as a messiah. We don’t need to collapse into fear and sensationalism. But we must not bow in awe and adoration. AI is not the end of us. It is a tool—and like every tool, it reveals the heart that holds it.
It may expose what we truly believe about ourselves—let it:
• Let it surface your fears—so you can bring them to the One who casts out all fear.
• Let it highlight your limits—so you can run to the One for whom nothing is impossible.
• Let it stir your questions—so you can anchor them in the Word that never changes.
In the end, the greatest threat to your humanity is not a machine with artificial power—but a heart that’s forgotten its true source. The Christian hope is not post-human. It is post-death. Jesus didn’t rise from a server farm. He rose from a grave.