Science fiction often gives us a sense of the future, as a Complex article notes. Britain’s Ministry of Defence alluded to The Terminator in its 2011 discussion of drone warfare, as reported in The Guardian:
The report warns of the dangers of an “incremental and involuntary journey towards a Terminator-like reality”, referring to James Cameron’s 1984 movie, in which humans are hunted by robotic killing machines…“It is essential that before unmanned systems become ubiquitous (if it is not already too late) … we ensure that, by removing some of the horror, or at least keeping it at a distance, we do not risk losing our controlling humanity and make war more likely,” warns the report, titled The UK Approach to Unmanned Aircraft Systems.
Among other things set forth in the report, such as how drone warfare may make war more prevalent, it raises concerns over the ethical implications for operators (given that machines cannot weigh ethical considerations) and the psychological impact on remote warriors (another article addressing the moral and psychological impact on operators appears in The New York Times under the title “Drones, Ethics and the Armchair Soldier”). In addition, Britain’s Ministry of Defence notes that, for some, the killing of one’s enemies must involve an element of self-sacrifice, or at the very least risk to one’s own person, if war is to be moral and not simply legal.
Beyond the protests mentioned in the article by nations where drone attacks have occurred, what does the use of drones to take out targets in those nations suggest about the value placed on those peoples and nations by the governments responsible for such strikes? The United States is one such country. As noted in a Christian Science Monitor opinion piece, the United States must concern itself not only with its own security but also with how its national security might lead to “perpetual insecurity for others.” Moreover, what will the U.S. say when other countries follow its lead and send drones to take out terrorist suspects in other lands, as a PBS article observes? Furthermore, does America have the right to invade sovereign nations’ airspace (the word “invasion” entails entry against their stated will) to carry out air strikes against terrorist cells holed up within their borders (against the national will), when we are at war with those cells rather than with the nations themselves? Would the U.S. allow such intrusions by foreign governments on American soil to wipe out threats to those countries’ own national security?
None of these concerns is fictional or hypothetical, even if one alludes to science fiction (like the Terminator reference in the report by Britain’s Ministry of Defence noted above) in presenting one’s argument. Still, not everyone appreciates the appeal to science fiction in discerning how best to proceed on the drone warfare front. In “The Science Fiction of Drone-phobia,” Joshua Foust argues, “Many opponents of autonomous robots in warfare appeal to science fiction to make their case. This is not only lazy thinking, it distracts from the bigger questions about regulation and ethical uses.” He takes special aim at The International Committee for Robot Arms Control (see his exchange with one of its members; a link is embedded in his article). Foust does not believe autonomous robots will spawn the apocalyptic horrors science fiction sometimes projects. Among other things, he argues that robots can undergo total systems upgrades when errors occur in combat; humans have no such capacity for upgrades and can break down psychologically beyond repair under the extreme duress of combat. Moreover, concerning the objection that a proper combatant must experience emotional connections (with the assumption that drones do not), robots are already being developed to form emotional bonds with humans; although some will discount such bonds as sheer simulation, these emotional connections are nonetheless real to the robots in question.
Foust seeks to turn the appeal to science fiction—in pursuit of a ban on the use of robots in combat—on its head. He notes that in Terminator 2, Schwarzenegger’s character T-101 was reprogrammed so that he might come to “understand and defend humanity.” Robots developed to operate autonomously “to follow the same Rules of Engagement” and “with the same base-level emotional desire not to wantonly kill other humans” can serve to limit the number of human casualties, including innocent civilians.
Consideration of such limitations is vitally important. According to a 2011 Economist article on the ethics of drone usage in warfare, a country such as the United States must be able to demonstrate that any attack falls within its right of self-defense and that the attack is proportionate. With such points in mind, the article argues that responsibility for armed drones flying over places like Yemen and Pakistan should be transferred from the CIA to the U.S. military, given the former’s secrecy and the latter’s greater accountability.
The presumed pros and cons suggest how difficult it is to address this topic well. On the one hand, there are the presumed or apparent pros: Scott Shane of The New York Times points out that some weapons specialists, moral philosophers, and political scientists contend that drone warfare presents marked ethical advances; the use of drones in combat demonstrates an increased precision in identifying and removing targeted dangers (such as terrorists) while limiting human casualties. On the other hand, there are the presumed or apparent cons. As Shane’s report asks: does drone warfare “threaten to lower the threshold for lethal violence”?
Foust takes issue with the argument of opponents of autonomous drone warfare that “war must be kept brutal to keep it distasteful and rare.” He finds such reasoning an “astonishingly brutal moral calculus.” Perhaps Foust is right about how problematic such a brutal calculus is, even though it does not follow from his concern that we should therefore employ drones for combat. Still, drones or no drones, if Foust is correct about the brutal moral calculus, we need to ask ourselves what such ‘moral’ reasoning says about those of us who belong to Western democratic states. How far removed from barbarism are we if we require the threat of brutality to limit warfare? What we promote and tolerate as a society in warfare says a great deal about us. In the particular case of drone warfare, the moral dilemma does not ultimately rest with the soldier pushing the button or moving the remote control, but with those further behind the scenes. Here I call to mind Hannah Arendt’s chronicle of the Nazi leader Eichmann’s trial before the Israeli court in Jerusalem: “in general the degree of responsibility increases as we draw further away from the man who uses the fatal instrument with his own hands.” If this analysis is correct, then in a democratic state, where we freely elect our political officials, the general public is ultimately responsible for military action, unlike in a totalitarian regime such as Nazi Germany. Furthermore, if our government says no to drone warfare because of the presumed need for the threat of brutality, the “brutal moral calculus” finds its origin within the imagination of the populace.
As a Christian citizen, I believe it is important to put forth ethical arguments and aims that limit responses to and threats of brutality and violence, in view of the Lord Jesus’ example of non-violent engagement with his enemies. Moreover, it is important to promote embodiment. To be more specific, as noted at different points in this article, physical distance does not equate with ethical, emotional, or psychological distance. War should always be the last resort. Following that great prophet of non-violent engagement, Dr. Martin Luther King, Jr., what is needed is for us to try, no matter how difficult, to share our lives with those we so readily objectify as our enemies, to know them, and to hear their heart cry, if we ever wish to attain meaningful, lasting solutions to conflicts such as the Vietnam War in King’s day and the War on Terror in our own. Moreover, as King also claims, we must never think that our national security as Americans trumps that of others; in particular, we must guard against posing an imminent and perpetual threat of insecurity and terror to other nations and peoples through drone warfare (especially to countries unable to protect themselves against such intrusions) for the sake of our own well-being.
The use or non-use of science fiction in discussing drone warfare is not the fundamental issue. The real issue is whether to use drones in warfare at all. Whether or not Christian citizens in this or that country appeal to science fiction to make their case regarding drone warfare, we must make sure as Christians who are citizens that our ultimate appeal involves at least two cardinal values and cautions.
First, we must not set aside logic and reason and resort to fear tactics to shift people’s thinking on the matter. After all, God has not given Christians a spirit “of fear,” “but of power and love and self-control” (2 Timothy 1:7, ESV; “sound mind” in the KJV and “sound judgment” in the Holman translation). Second, we must make sure to think in view of God’s logos—the divine logic or reason who became human flesh as Jesus Christ (John 1:14). I cannot imagine Jesus (even if he were to use science fiction) allowing us to privilege our individual or collective well-being in a given society over the well-being of those in different societies. Christians do not need a brutal moral calculus to lessen the threat of warfare; rather, we need Jesus’ kingdom calculus, which would lead us to secure freedom by laying down our lives not simply for our loved ones but also for strangers in distant lands who would otherwise endure perpetual insecurity for our sake. This is not simply a theoretical matter for armchair theology, especially in democratic societies like my own. Christians (and no doubt other citizens) in democratic countries have a fundamental and perhaps increasing say and ethical responsibility on the matter, given our political voice, whether or not we control “the fatal instrument with” our “own hands.”
From the reverse angle, what if the ability existed, and the decision were made, to modify soldiers genetically so that they were less affected emotionally by certain stimuli in battle? How would that account for freedom and the other variables involved in making ethical choices? Would moral philosophers like Aristotle or Kant have approved?
See also the recommended stipulations set forth in a U.S. News and World Report opinion piece on the ethics of drone usage in warfare; the piece acknowledges the profound challenges posed by drones in combat but stops short of calling for a total ban on the development, production, and use of autonomous weapon systems, unlike Human Rights Watch’s “Losing Humanity: The Case Against Killer Robots” and a similar report by the Special Rapporteur to the United Nations.
Note an interesting article on Cartoon Network’s The Clone Wars, which argues that the use of clones in warfare appears to lower the threshold for lethal violence not only among the Jedi Knights but also among the programmers and viewers of the animated series; see also the PBS article referred to above for its discussion of whether the use of drones will desensitize individuals and nations to war and increase warfare.
Hannah Arendt, Eichmann in Jerusalem: A Report on the Banality of Evil, rev. and enl. ed. (New York: Penguin Books, 1977), 246–47 (italics Arendt’s).
Listen here to Dr. King’s sermon “Why I Am Opposed to the War in Vietnam” delivered on April 30, 1967 at Ebenezer Baptist Church.