
Excessive consumption causes harm. This includes using AI.

Can we conceive of a world in which we use artificial intelligence in moderation?

Data centers are causing controversy across America and around the world.

In early December 2025, my county council in Indiana voted on a rezoning that would allow a new data center to occupy more than 1,000 acres of prime Midwest farmland, just a few miles from another 1,200-acre project that Amazon Web Services has recently completed. It’s difficult for local communities, especially rural ones, to resist the lure of immediate construction jobs.

But data centers require few employees once up and running, so once the construction jobs go away, residents may end up losing not only land, but also the basics of living on it. In addition to the enormous expansion of industrial land use, the centers register significant demand for water and electricity. And my community recognized this: After a marathon meeting that lasted until 4 a.m., the rezoning was voted down 7 to 2.

As Pope Francis quotes his predecessor, Pope Benedict XVI, in the encyclical Laudato Si’ (On Care for Our Common Home), “the external deserts in the world are growing, because the internal deserts have become so vast.” The hunger for ever more technological progress isn’t just an environmental problem; Francis says it is a “summons to profound interior conversion.” Such challenges ultimately rest on the question of how exactly we use the artificial intelligence powered by these centers.


Right now, some of us experience AI similarly to how we engaged with the internet in its early days—we sometimes intentionally choose to use it to complete a task or find information. For others, it’s more ever-present. Elise Ureneck recently wrote for Angelus News about the ubiquity of relationships with chatbots. She even talks about a wearable AI “friend.” Worn as a necklace, it listens to your conversation, presumably then filling you in on what it “thinks” you’d like to know. Perhaps some people will resist this, as they did the perceived eavesdropping of products such as Alexa. But it’s also possible that such “companions” will become necessary interfaces for engaging with the world, like the smartphones that we now must carry around to accomplish many things.

Ureneck argues that the church “should prohibit any engagement with AI as if it were a human. The stakes are just too high.” The sentiment is powerful, but I’m not sure I agree. I have a friend who uses ChatGPT for German language acquisition. She says it is an outstanding tool for this purpose. Is she using the site as a replacement for a human tutor or a German-speaking lunch table at a university? Surely she is, but interacting with AI as if it were a human in this scenario is likely less anxiety-inducing than Ureneck’s examples of lonely people using AI for friendship or romance.

A better way to consider AI limits is to draw a distinction, made by political scientist Joshua Mitchell, between supplements and substitutes. Mitchell cites many such examples: Fast food can quell our hunger occasionally but shouldn’t substitute for balanced, nutritious meals on a regular basis; virtual meetings and classes are useful, but can’t fully replicate what’s gained from in-person gatherings. Something can be good as a supplement but disastrous as a substitute. In the long run and collectively, AI as a substitute for human intellect and relationships sacrifices both individual competence and communal connections.

Can we conceive of a world in which we use AI in moderation? It is not so much a yes or no question as a question about where we place our limits. Twenty years ago, the environmentalist Bill McKibben said the fallacy of our approach to economic and cultural life is that we are like the person who reasons that if two drinks make us feel good, 10 drinks will make us feel five times better. The reality is that excessive consumption causes harm.


In Laudato Si’, Francis says that choices about technologies are really choices about the kind of society we want to build. Are we choosing an AI-saturated world, in which AI becomes our omnipresent “friend”? Or can we choose to have the amazing document-sorting or cancer-detecting abilities of AI but reject the need to rely on it for daily accompaniment?

The financial forces that have invested in AI need a high rate of return. It’s a familiar pattern: Some companies start off with a mission for good, but eventually huge investments necessitate an equally huge monetization of the product, and sometimes a drive toward monopoly.

OpenAI, the parent company of ChatGPT, started with a nonprofit board that was committed to ethical AI (i.e., AI with limits), prioritizing benefit to humanity over profit. Yet in 2025, it completed a transition to a more conventional for-profit governance structure. Why? To attract the best talent and investors by maximizing the returns of an anticipated future IPO. The investments in such corporations, their chip suppliers, and their builders mean they need AI to be everywhere, because only then (maybe) can it be monetized and generate a return.

We cannot expect that big corporations will manage this moderation. The temptation to go all in, backed by mantras of consumer freedom and future magical wonders, is too strong. Moderation will come about by thinking in terms of the classic Catholic position of subsidiarity—by setting limits on where we allow AI to enter.


Subsidiarity names the principle that individuals and local groups should retain their own authority and integrity. That’s what happens when local communities make their own choices about land use. But it’s also involved in the design of products themselves. For example, surveys indicate that a majority of users do not want AI enabled, and yet Google and Apple increasingly default to AI, requiring intentional workarounds to disable it.

We need to learn when to say no to AI. But to do that, we need to determine when it is a useful supplement and when it is an unnecessary substitute for learning or companionship that just wastes time and resources, covering an interior emptiness that only community with living people and the living God can fill.


This article also appears in the March 2026 issue of U.S. Catholic (Vol. 91, No. 3, pages 40–41).


About the author

David Cloutier

David Cloutier is professor of theology at the University of Notre Dame, with a concurrent appointment in the business, ethics, and society program of the Mendoza College of Business.
