Cans and Shoulds
One of the great parts about my job with the Newbigin Fellowship is getting to know people who have their hands in the projects that are shaping the world of tomorrow. An engineer going through the program talks about how hard it is to get people to even ask questions such as, What is this going to be used for? and Should we be building this?
Technology is seen as its own inherent good. More is better. And from what I hear, AI is the best thing going right now, so if you want venture capital you’d better throw some AI into your project.
But power is never inherently good. It is only power, waiting to be deployed. And for now at least, humans are the agents of that deployment.
My great concern is this: at the worst possible time we are, generally speaking, people with underdeveloped capacity for moral discernment.
For many of us, the fact that we can technologically do something is sufficient reason to say that we should (or, at least, not to say that we shouldn't). For others, the "new" is always an imagined horror — but one that wanes within half a generation, so that their kids or communities end up using the once-vilified technology to vilify the next one before we know it.
As Americans, most of what passes for new technology enriches our country and our retirement portfolios. It is deployed by our military on the other side of the globe (for our “defense,” mind you) so we never see it and “it keeps us safe.”
It is hard to question deployments of power that increase our own. Power is insidious like that.
We need to start asking (better) questions. We need to start giving (better) reasons. Those of us who identify as Christians would do well to take some time out from our various sound-bite worlds to nurture our ethical processes.
Many of us who grew up in the Christian church also went through a litany of Christian education that did not train us to handle ethical questions with nuance. I love the Bible, but a commitment to Scripture alone as our guide to life leaves us with underdeveloped ethical imaginations when it comes to the morality of a self-driving car or swarming drones.
We probably need rules. But we need more than rules. We can always find our way around rules in our moral reasoning. (Cf. the commandment against murder and the nearly ubiquitous unquestioned loyalty by U.S. Christians to our military-industrial complex).
We probably need principles. Principles have a flexibility that rules often lack. That is needed.
But we need principles that are more than simply rules in a different guise. We need principles that give voice to the better future we hope to attain and that articulate the outcomes we value.
Visions of the future. We Christians should be great at this. No, I'm not talking about Revelation and mass-marketed Left Behind books. That's not a future conducive to the moral imagination. Nor is it a good reading of the Bible.
We need to know where we hope to go, what kind of world we hope to create, and how to wisely read the times so as to know what is required to get us there. We also need the courage to name the things in the current world that do not measure up, even when they conduce to our own good.
What’s the Future?
So here are some questions for you.
How would you describe the future that we should be striving for? What would be the markers that we had attained that future? And what sorts of projects would it require us to leave behind?
And here’s another line of thought: can a vision that is shaped by core religious convictions be good for the whole world, including those who do not share the convictions?
Where is better moral reasoning needed in your neighborhood, school, or industry? What questions need to be asked and what sorts of answers are you hoping will be given?
I hope we can answer these kinds of questions together. Because the only cure for anthropophobia is people working together for a greater good.