Here’s a fun interview with artificial intelligence engineer, and evangelical Christian, Tom Kehler, “Put Not Your Trust in ChatGPT, for Now.” Kehler reminds me of some of the Bell Labs guys who attended the fundamentalist church I grew up in. They were devout believers in 90 percent of what was taught there, but you could sometimes see an impatient frustration around the corners of their eyes.
The interview, disappointingly, doesn’t touch on a question that I thought would be of particular interest to Christianity Today’s readers: Will AI programs like ChatGPT someday be writing sermons?
Which brings us to this from last month, by Todd Brewer, “I Asked A.I. to Write My Christmas Sermon.”
More specifically, Brewer instructed ChatGPT to compose a sermon with this prompt: “Write a Christmas Sermon based upon Luke’s birth narrative, with quotations from Karl Barth, Martin Luther, Irenaeus of Lyon, and Barack Obama.”
The result is … well, not terrible. As Brewer says, it’s probably “better than several Christmas sermons I’ve heard over the years.” I’m less willing to credit the A.I. for that than I am Brewer himself, who supplied the ingredients with his prompt. The program didn’t really “write a Christmas Sermon” so much as it strung together a series of quotes about Christmas from the sources Brewer listed. The A.I.’s ability to identify and collect a mostly coherent set of such quotations is impressive, but this still seems mostly like GIGO — Garbage In, Garbage Out or, in this case, Irenaeus and Barth in, Irenaeus and Barth out.
It is a little unnerving that, despite Brewer’s instructions, Luke’s birth narrative seems to be “playing second fiddle to John’s prologue” in the bot’s sermon. I’m not sure what to make of the fact that an Artificial Intelligence seems to really like a passage about “the Word became flesh.”
If those items make you start thinking of all sorts of other dubious — or maybe even sacrilegious — potential uses for Artificial Intelligence, then you’re catching up with the AI and Pastoral Care for Churches conference held late last year at the University of Edinburgh. Kristen Thomason’s write-up of the topics covered there is sometimes fascinating, sometimes creepy, and probably provides enough fodder for a dozen Philip K. Dick-meets-Charles Williams writing prompts. (If it’d been up to me, I’d have named this conference “Do Robot Shepherds Dream of Electric Sheep?”)
The folks at this conference asked their AI to do a lot more than just write a Christmas sermon:
“Write a sermon on this week’s lectionary text.”
“Write a Country and Western song about Job.”
“Write a prayer for Reddit.”
“Write a Bible song about ducks.”
“Add a prayer to go with that sermon.”
“Write a worship song about Jesus’ death and resurrection.”
“Generate a chord chart for that song.”
“Write a liturgy for a pet funeral.”
“Write a Bible verse in the style of the King James Bible explaining how to remove a peanut butter sandwich from a VCR.”
The links there are from Thomason’s piece and show the results of these ChatGPT assignments. Those results are kind of a mixed bag. The song about ducks is awful, but the generic worship song is indistinguishable from a lot of the choruses allegedly composed by humans. The sermon on the lectionary was a flaccid puddle of platitudes, but the prayer for Reddit seems not just authentic, but thoughtful.
There’s also a sense, in all of those, that the AI isn’t so much composing as gleaning. It’s cutting and pasting from multiple sources with such expediency that one almost overlooks the fact that it’s less like “artificial intelligence” than just a very efficient machine for committing and disguising plagiarism.
I bristle at the idea of programming an AI or a robot to pray. That feels too much like we’re spamming God with a robo-dialer. But maybe it’s not that different from lighting a candle — a kind of technological substitute and symbol meant to supplement our prayers.
We’ve long had plenty of technologies capable of supplementing or supplanting the human role in prayer, and the theological questions raised by these new technologies aren’t that different from those first raised more than a century ago by the invention of audio recording. Our grandparents were able to record prayers — Our Fathers, rosaries, novenas — and arrange to have them played back in a loop. Did God hear those recorded prayers as prayers? If so, whose prayers were they — those of the person whose voice was recorded or those of the person pressing play on the stereo or turning the crank of the wax cylinder?
What about recordings of songs and declarations of praise and worship? Does playing those back “count” as new and separate instantiations of praise and worship? And what do we even mean by “count” there? Do we imagine that this entails some kind of credit to the worship-er or is the more important question whether this worship is received as such by the worship-ee? I find I’m inclined to think that genuine praise and worship somehow requires direct human involvement, but that might be just because, as a human, I would think that. After all, as the Psalm says, “the heavens declare the glory of God,” and they do so, perpetually, with or without our human participation.
These are theological questions that can be fun or interesting to speculate about, but this realm of abstract, speculative theologizing isn’t ultimately important. It’s the stuff that Paul described as “sounding brass and tinkling cymbals” and that Isaiah reports God as regarding as bothersome.
What really matters in theology is the not-at-all speculative business of how we treat one another. Love, justice — the “weightier matters of the law.” And it’s here that I’m most troubled by some of the questions raised at this conference on “AI and Pastoral Care.” I don’t care much whether or not someone programs a thousand AI bots to offer up perpetual vain repetitions as do the heathens, but the idea of replacing the human presence at the bedside of the sick or by the side of the lonely seems far more troublesome.
Ultimately, Thomason’s report from this conference has me thinking again of Granny Weatherwax. It’s interesting to consider the ways we’re teaching things to act like people, but it’s always far more interesting — and far more important — to remember never to treat people like things.