There’s an algorithm for that? Really?
Skeptics and humanists are good at attacking religious concepts of humanity, society and the universe, and describing the way these ideas dehumanize and denigrate us. But we should also be able to call out dehumanizing and dangerous rhetoric of the secular sort. I’m talking about the offhand way we use metaphors that compare us to machines. We also have a tendency to describe our modes of inquiry in technological terms, and talk about nature and reality as if they’re engineering projects.
The Machine of Nature
During the Scientific Revolution, scientists and philosophers modeled the universe on the example of the clock: a complex invention that ran according to well-understood principles. As so often happens in history, the metaphor became reality. Scientists defined phenomena by how they work, and what function they serve. Not only that, but the machine metaphor defined the process of inquiry itself: to understand something, you need to open it up and see how its parts work. A reductionist approach denies that the whole is greater than the sum of its parts; the point is that the whole is nothing more than the sum of its constituent elements.
The irony is that, scientifically speaking, the mechanical universe is as obsolete as the whalebone corset. Darwin, Einstein, and Freud showed us a universe of contingency and indeterminacy, and did away with the knowable, predictable, ordered reality of the Enlightenment. So why does the dusty old machine metaphor still resonate with us?
It’s a fantasy about control. Humanity has been at the mercy of Nature’s often dangerous whims for so long that we’ve devised methods to make us feel like we’re the ones in charge. But just as prayer and ritual behavior gave us the illusion of power, making reality fit our models does nothing except reinforce our delusions of dominance. And that delusion has consequences for ourselves and the environment. During the Industrial Revolution, promulgating the scientific concept of Nature as an inert machine was necessary to overcome superstitious beliefs about the consequences of exploiting natural resources for gain. What a deal: all the dominance with none of the responsibility.
Algorithms All the Way Down
Scientific inquiry itself is frequently conceptualized as a machine, a tool with which humanity studies phenomena. Considering the way scientists like Richard Dawkins and Lawrence Krauss talk about science taming Time and Space and decoding the universe, you’d think this idea ennobles humanity. However, there’s an anti-human undercurrent to this kind of rhetoric too. It conceptualizes scientific inquiry as something algorithmic, an automatic process that eliminates human error, but it doesn’t acknowledge the personal, cultural, and political messiness of this human endeavor. Evidence goes into the machine, and Truth comes out.
This is another fantasy that denies the human aspect of empirical inquiry. It’s no easier to remove humanity, with all its biases and interests, from the process of empirical inquiry than it is to remove it from the way we conceptualize government or the economy. The claim that science is “self-correcting” is a common one, but it’s magical thinking.
The Soft Machine
In the comments section of my previous post, plenty of commenters expressed amusement or outrage at the implication that there’s more to human cognition and consciousness than neuroprocessing. One said, “Our brains are meat computers; I’ve seen nothing to indicate that this isn’t the case.” Another asserted, “There is no ghost in the machine.”
Well, that’s true. Because there’s no machine.
Talking about humans in engineering terms is the dictionary definition of dehumanization. In one sense, it’s typical of the anti-intellectualism of com-box discourse. Examining the linguistic and cultural contexts of how we conceptualize and interpret human experience is an incredibly complex challenge, and it’s easier to spit out borrowed rhetoric dressed up in sciencey words and pretend you’re engaging with the phenomenon in a sincere way. What’s truly disturbing, though, is that people regurgitate this anti-human numbnuttery because it relieves us of responsibility for our beliefs, behavior, and societies. Don’t blame us; we’re just data processing! As philosopher Stanley Cavell used to say, “Nothing is more human than the wish to deny one’s humanity.”
Furthermore, this shows how infatuated we can get with our own metaphors. Dawkins calls us gene machines, and DNA is so frequently referred to as digital code or a program that we’ve become unable to approach biology and evolution as anything other than engineering. Once again, when we make reality fit our models, that’s called pseudoscience.
What do you think? Do machine metaphors truly help us understand ourselves and reality, or do they just pander to our need for control and our urge to deny responsibility?