Yesterday we looked at a searing critique, from David French and Kierkegaard, of how Christianity can be corrupted by “Christendom”; that is, the faith’s institutional and cultural expression.
Baptist ethicist Andrew T. Walker recognizes the problems with cultural Christianity. But he argues that it’s not healthy for the church to become completely unmoored from the culture and that Christianity’s influence on the culture has been a good thing.
From his essay “What We Lose in the Decline of Cultural Christianity”:
From concepts like dignity, justice, and rights to the centrality of the family to the idea of life having an ultimate purpose—all of these have found unique expression in Western civilization as the result of Christianity. Even many non-Christian historians would agree with this analysis. Society requires some governing moral vision at its center.
What’s more, the naked public square will be harsh on many groups, not just Christians. To lament the decline of cultural Christianity is to lament not simply the loss of a Christian consensus, but the loss of the social capital born of common grace that secular society was borrowing from. Is it any surprise that a growing secularity is coinciding with the hollowing out of American civil society? When you define well-being in material terms only, it’s easy to miss that alongside growing secularism is a shrinking marriage rate, surges in addiction and suicide, and a whole new category we call the “loneliness epidemic.” As society sheds its Christian foundations, there will be a serious detriment to human flourishing. We should mourn this as Christians. We don’t want just the salvation of our neighbors but the good of society, too.
We shouldn’t ignore the costs of displacing even a cultural Christianity’s influence.
The decline of the Bible Belt will be met with a redefined common good that will be anything but good for millions of people. While I too have concerns about the ambient uses of “culture” attached to Christianity, it would be incredibly naïve and even disrespectful to ignore the ways that Western civilization has been influenced by Christianity. From the idea of the university to the hospital—and even basic charity—the Christian tradition forges principles that birth institutions, not only engaging the culture but transforming it.
Which side do you think makes the better case?
It seems to me that in some cases, Christianity influences the culture. And in other cases, the culture influences Christianity. The latter is where most of the problems come from.
Also, Christians don’t always have a choice about their relationship to the culture. Sometimes the culture is hostile to them, which seems increasingly to be the case today. And influence is often unintentional, not so much a planned initiative as something that happens of itself, when outsiders notice something they would like to emulate, sort of like, to use the Biblical metaphor, the way salt flavors food. And, of course, cultural influence on the church works the same way, to use another Biblical metaphor, like leaven permeating the dough (Matthew 16:6, 11-12).
Furthermore, the doctrines of vocation, the estates, and God’s providential reign over His temporal kingdom connect Christianity to culture and, yes, to institutions. Those are Lutheran concepts, but they have correlatives in other Christian traditions. There is a tension between Christians acting to love and serve their neighbors in the world, and the sinfulness of that world, indeed, a conflict between God’s reign and the usurpation of the Devil as the “prince of this world,” but God does not generally call us away from the arena.
Photo via PxHere, CC0, Public Domain