The Automatic Faith Project: How are MSU student religious groups using AI?

 

The author asked Adobe Firefly to create an image of "a group of Christian bulldog mascots discussing the Bible with a maroon background." This was the result. 

By Madison Jones 

College students today are more plugged in than ever—but less religious than generations before them. So how do faith-based organizations, rooted in centuries-old traditions, reach students living in the age of AI? And when it comes to tools like ChatGPT, how do different religious groups on campus feel about mixing spiritual practice with artificial intelligence?

At Mississippi State University, Jewish and Christian organizations like Hillel and Fellowship of Christian Athletes (FCA) are both trying to offer students community, connection, and a deeper understanding of their faith. But when it comes to AI, their goals and concerns are shaped by the unique experiences and traditions of their religions. For some, AI might help explain a Bible verse; for others, it feels like the opposite of what faith is supposed to be.

Hillel 

Hillel, the Jewish Student Association at MSU, operates in a state where less than one percent of the population practices Judaism. Because so few Mississippians have ever encountered the faith firsthand, Emma Noble, Hillel’s treasurer, says one of the group’s core goals is to challenge preconceived notions about the Jewish community.

“A lot of what we do at Hillel is outreach, not just to our Jewish students, but to the community at large. I know I would say the majority of non-Jewish people that I spend time with, I am the first Jew they've ever met. And a lot of them either know absolutely nothing or know things that are just incorrect,” Noble said.

Halle Kranson, a Jewish student majoring in physical therapy, says she appreciates groups like Hillel as representation for Jewish people on college campuses and as an opportunity to connect with others who share her faith. She echoes Noble’s point about education and outreach.

“If people hear what we believe in, maybe they will be more accepting,” Kranson said. “Hillel can't use AI to do so because it makes outreach more real and genuine when you learn it from real people.”

This makes Noble’s role on campus unique — she’s often the first person to introduce students to even the most basic elements of Judaism. Her goal is to offer a warm and humanizing first impression of the faith, not to guide people into becoming deeply observant followers. With that goal in mind, she says today’s version of AI doesn’t really have a “place” in Hillel’s operations.

“We try to make opportunities for college students to just come hang out and eat bagels or, you know, kind of just try, like, humanize us and expose people to Judaism. I really don’t think AI can make the connections our group is here to make,” Noble said. 

Kranson also says she is cautious about using AI in her own faith practice and appreciates that Hillel shares that caution.

“If AI were to generate a rabbi’s entire thoughts, I don't think it would be genuine. I don't think it would be a lesson that could relate to many people. I have never used AI in my practice because I don't see how an impersonal program could benefit me spiritually,” Kranson said.

Noble agrees. She feels AI would not be helpful to Hillel’s religious practice due to the nuances of Judaism’s doctrine.

“There’s this concept in Judaism called a mitzvah—it translates to a good deed or a commandment, something we’re supposed to do. Learning about the Torah, for example, is a mitzvah. It benefits us personally, but it’s also good for the community. Replacing that with AI wouldn’t really reflect the spirit of the commandment. I can see use cases for individual needs—like for people with learning disabilities or limited time, AI could help summarize things. No judgment there. But for Hillel, I just can’t picture implementing it, because the process itself is important in our practice,” Noble said.

She also expresses concern about the reliability of AI sources when it comes to Jewish teachings.

“That is definitely one of my primary concerns about AI—is spreading additional misinformation. Not necessarily in a malicious way, but just because there are so many sources that have misinformation about Judaism, even when it passes as, you know, what might appear as a reputable source,” Noble said.

Fellowship of Christian Athletes 

Mississippi State University’s Fellowship of Christian Athletes (FCA) operates on a campus with 23,000 students, though fewer than 500 fall within its target audience of student-athletes. Justin Pigott, FCA Campus Director, focuses on creating a space where like-minded athletes can connect and learn how to hear from the “Holy Spirit”—something Pigott says is essential for truly practicing the faith.

“Being present with people, engaging with people, and hearing from the Lord when you're with people, that's a big thing. And so that’s the majority of our job,” Pigott said.

Madelyn Keating, a Cyber Security and Mathematics major on the Track and Field team, said she attends FCA to bond with like-minded students and grow in her faith.

“This group helps me socially, spiritually, and emotionally. I am able to navigate my emotions as a student-athlete and grow deeper in my faith with other FCA members and leaders,” Keating said.

Pigott explains that much of FCA’s focus is helping students develop a personal relationship with God—an experience he believes artificial intelligence can't replicate. 

“What we were passionate about, with the Monday times or with the individual times that we have with athletes or small groups, is teaching them how to hear from God themselves. So in order to do that, you've got to learn to be quiet. And so to me, it would go against the very core of what we are, and it could be a little conflicting,” Pigott said.

That said, FCA isn’t entirely closed off to AI. Pigott acknowledges its usefulness in planning discussions and generating thoughtful questions.

“AI gets you thinking. It has produced some really great questions for our huddles, to where we get people thinking in discussion. If we're really having a hard time while preparing to communicate something and we're sitting on it, ChatGPT does a great job of kind of giving us a starting point. Or an outline,” Pigott said.

Keating agrees that AI has a place in FCA if it encourages discussion.

“I believe AI could be a good tool to answer questions, explain devotionals and the Bible. As long as it is being used interactively, I think it can be useful to FCA,” Keating said.

She even uses it regularly to study scripture.

“When I read the Bible, I ask ChatGPT to break down every part of the reading—then I take notes on what it says. I usually say something like, ‘break down Matthew 9:1-10 for dummies,’ and it tells me the deeper meaning of the verses,” Keating said.

Still, Pigott warns against overusing AI to the point where it strips away uniqueness—something he believes contradicts Christian values.

“But, yeah, I think in anything, my greatest fear is to become generic. I don't want to become generic. I don't think we're made in the image of God to be generic. The beauty of the body of Christ is extremely diverse. And nothing about the way heaven is described is generic either,” Pigott said.

Jewish and Christian student organizations have distinct relationships with AI, shaped by their doctrines and by each group’s goals. For Jewish students at MSU, like those involved in Hillel, there’s a strong focus on representation, human connection, and protecting sacred tradition from being misunderstood or misrepresented. For Christians in FCA, AI might offer a helpful jumpstart to conversation—but not a replacement for the quiet, personal experience of hearing from God.

Ultimately, these organizations aren’t just about teaching doctrine. They’re about showing up. Whether it’s bagels or Bible studies, what matters most is the human connection—something AI might assist with but can’t replace.