You are the voice. We are the echo.
The Echo
Taylor University, Upland, IN
Sunday, April 13, 2025

A Christian view of AI

Technology is a tool, not a savior

Hundreds of people used ChatGPT to turn their favorite pictures into Studio Ghibli-inspired art last month, sparking a frenzy on X. The usual controversy followed: was AI committing plagiarism by generating images in someone else’s style?

Sonny, a part-human, part-AI counseling chatbot, is used by 4,500 students across nine school districts, according to World Magazine. It’s designed to let students chat with it about anything.

A megachurch pastor launched an AI version of himself that people could pay to download. It’s designed to give people a personal (AI!) pastor they can talk to or ask for prayer. AI can also generate music, write a paper, or create videos without much human participation. 

The Barna Group reports that Christians, for the most part, are wary of AI and its use in the church. But AI seems like it’s here to stay, and Christians need to know how to interact with it.

Arthur White, a professor of computer science and engineering at Taylor University, said the best way to look at AI is as a useful and powerful tool. It’s helpful for doing unpleasant repetitive tasks, but not something to look to for hope because it’s drawing from a fallen source. 

“AI is not a savior. …think of it as a compilation of all... these human writings and musings over the years and stuff,” he said. “And essentially, it's pulling what it can out of that. And so when you think about that, you think about the source that it's pulling from. It's a fallen source.”

AI doesn’t have a moral foundation apart from what it’s programmed with. 

ChatGPT said, “My ‘moral framework’ is influenced by principles from philosophy, ethics, and the guidelines set by my creators.”

How does the machine define ethics or philosophy? AI’s worldview is not necessarily a biblical worldview, and its idea of ethics is not anchored in Scripture. There are certain things AI should not be used for. 

John Denning, chair of the department of computer science and engineering, said AI should not be making moral decisions, such as deciding how long someone should be incarcerated or whether someone’s health claim gets approved.

“The business [might] be able to churn out more people and optimize toward profit, right?” he said. “But is that in the interests of the society? Is that given too much power?”

Denning pointed out that ChatGPT is also really good at flattery and giving the impression of emotion. Krista Ryndak, a biblical counselor, said people are using ChatGPT to cope with mental health issues instead of finding a real-life friend.

“Another interesting popular coping tool these days is ChatGPT,” she said. “So instead of doing the work of going out and finding a real friend, a lot of people are using ChatGPT or AI-related things as substitute friends. Because that friend won’t reject you, that friend will speak positive affirmations to you.”

God didn’t create people to cope with mental health struggles by turning to a computer, especially a computer that doesn’t have a biblical understanding of the world.

So, Christians should use AI, but use it cautiously. Stephen Brandle, a professor of computer science and engineering at Taylor University, pointed out that AI is also being used to speed up Bible translation.

Realize it’s a tool and, like any other tool, it can be used for good or for evil.

“Just because something can be misused doesn't mean that you shouldn't use it at all,” Brandle said. “The correct answer, I think, is to use it. But thoughtfully.”