Drones that kill on their own

On the horizon of military technology: Drones that “decide” on their own whom to kill:

One afternoon last fall at Fort Benning, Ga., two model-size planes took off, climbed to 800 and 1,000 feet, and began criss-crossing the military base in search of an orange, green and blue tarp.

The automated, unpiloted planes worked on their own, with no human guidance, no hand on any control.

After 20 minutes, one of the aircraft, carrying a computer that processed images from an onboard camera, zeroed in on the tarp and contacted the second plane, which flew nearby and used its own sensors to examine the colorful object. Then one of the aircraft signaled to an unmanned car on the ground so it could take a final, close-up look.

Target confirmed.
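
The sequence described here is, in effect, a cross-confirmation pipeline: one aircraft detects a candidate target, a second aircraft checks it with independent sensors, and an unmanned ground vehicle takes the final look before anything is confirmed. A minimal sketch of that logic follows; every class, method, and threshold in it is a hypothetical placeholder, not the actual Georgia Tech software.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        location: tuple      # (latitude, longitude) of the candidate target
        confidence: float    # 0.0 to 1.0 score from the onboard image classifier

    def confirm_target(search_drone, verify_drone, ground_vehicle, threshold=0.8):
        """Cross-confirm a detection with three independent looks."""
        detection = search_drone.scan()                   # aircraft 1: camera plus onboard image processing
        if detection is None or detection.confidence < threshold:
            return False                                  # nothing found, or not confident enough
        if not verify_drone.inspect(detection.location):  # aircraft 2: examines with its own sensors
            return False
        return ground_vehicle.close_up_look(detection.location)  # ground vehicle: final, close-up check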

This successful exercise in autonomous robotics could presage the future of the American way of war: a day when drones hunt, identify and kill the enemy based on calculations made by software, not decisions made by humans. Imagine aerial “Terminators,” minus beefcake and time travel.

The Fort Benning tarp “is a rather simple target, but think of it as a surrogate,” said Charles E. Pippin, a scientist at the Georgia Tech Research Institute, which developed the software to run the demonstration. “You can imagine real-time scenarios where you have 10 of these things up in the air and something is happening on the ground and you don’t have time for a human to say, ‘I need you to do these tasks.’ It needs to happen faster than that.”

The demonstration laid the groundwork for scientific advances that would allow drones to search for a human target and then make an identification based on facial-recognition or other software. Once a match was made, a drone could launch a missile to kill the target. . . .

Research into autonomy, some of it classified, is racing ahead at universities and research centers in the United States, and that effort is beginning to be replicated in other countries, particularly China.

“Lethal autonomy is inevitable,” said Ronald C. Arkin, the author of “Governing Lethal Behavior in Autonomous Robots,” a study that was funded by the Army Research Office.

Arkin believes it is possible to build ethical military drones and robots, capable of using deadly force while programmed to adhere to international humanitarian law and the rules of engagement. He said software can be created that would lead machines to return fire with proportionality, minimize collateral damage, recognize surrender, and, in the case of uncertainty, maneuver to reassess or wait for a human assessment.

In other words, rules as understood by humans can be converted into algorithms followed by machines for all kinds of actions on the battlefield.

via A future for drones: automated killing – The Washington Post.
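
Arkin’s point that rules understood by humans can be converted into algorithms followed by machines is easiest to picture with a toy example. The sketch below is purely illustrative: the function, field names, and thresholds are hypothetical stand-ins, not any real or proposed targeting system.

    def engagement_decision(target, confidence, expected_collateral, proportional_limit):
        """Return 'engage', 'hold_for_human', or 'do_not_engage' under simple ROE-style checks."""
        if target.is_surrendering or target.is_protected:
            return "do_not_engage"   # recognize surrender; never strike protected persons or places
        if confidence < 0.95:
            return "hold_for_human"  # uncertainty: reassess or wait for a human assessment
        if expected_collateral > proportional_limit:
            return "do_not_engage"   # proportionality: expected harm exceeds the allowed limit
        return "engage"

Whether a handful of checks like these could ever capture what international humanitarian law and the rules of engagement actually demand is, of course, exactly the question at issue.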

The article alludes to “ethical” and “legal” issues that need to be worked out with this particular technology.  Like what?  Is automating war to this extent a good idea?  Does this remove human responsibility and guilt for taking a particular human life?  Might this kind of technology develop, eventually, into a military without actual human beings in overt combat?  How could this be abused?

About Gene Veith

Professor of Literature at Patrick Henry College, the Director of the Cranach Institute at Concordia Theological Seminary, a columnist for World Magazine and TableTalk, and the author of 18 books on different facets of Christianity & Culture.

  • http://www.allenthemelancholy.com/ Allen

    The most underreported story of the past 10 years is the total revolution in how the U.S. fights wars. “Fighter pilots” fly drones from U.S. soil, kill and destroy, then go home to their families for supper. Drones destroying on their own. Civilian agents outside of the military command structure being intertwined and making real time decisions with drones. Civilian information agents intertwined with military (Bin Laden Raid). Ethics has not kept pace.

    The conflicts of the past 10 years have totally redefined war. Just as the U.S. Civil War helped shape the kinds of wars fought in WW1 and WW2, these conflicts will define how wars are fought in the future.

  • SKPeterson

    Terminator. RoboCop. Do we need any more warnings on the potential catastrophes awaiting mankind?

  • Dan Kempin

    “Lethal autonomy is inevitable,” said Ronald C. Arkin,

    I agree.

    But is this an ethical dilemma? No more than any of the other implements of modern war. What of the artillery that fires beyond the ability of the artillerist to see? What of the missile that is launched on a target from over the horizon? It seems that these are all variations of the same concept.

    Are these autonomous robots that make their own judgments on what is worthy of life? It hardly seems so. They are very specifically programmed, and in this case they verified the target three times. Frankly, they will probably improve the battlefield by reducing human error and friendly or civilian fire. The drones will never attack because they are frightened or angry.

    “Does this remove human responsibility?” Of course not. And I doubt that the emotional scars of war have really changed all that much from the time of the spear and shield. Yes, also, to the “can it be abused?” argument, which can be applied to the sword and spear as well, along with the handgun, kitchen knife, fist, words, and pretty much everything else.

  • http://theoldadam.wordpress.com Steve Martin

    No machine made by man kills on its own.

  • Joe

    “There was a war. A few years from now. Nuclear war. The whole thing. All this … everything … is gone. Just gone. There were survivors. Here. There. Nobody knew who started it. It was the Machines… Defense network computer. New. Powerful. Hooked into everything. Trusted to run it all. They say It got smart…a new order of intelligence. Then It saw all people as a threat, not just the ones on the other side. Decided our fate in a microsecond… extermination.”

    – Kyle Reese

  • Kirk

    I, for one, welcome our new robot overlords.

  • Jonathan

    There is a legal concept known as the Law of War. It has certain principles like distinction, discrimination, proportionality, lawful weaponry, command, and so on.

    Such an “autonomous” system may indeed be legal if it can meet the intent of the LOW: if it can follow orders, truly discriminate the enemy from protected places and persons, and minimize collateral damage. In other words, if it can make the human decisions that a soldier has to make.

    In other words, it can’t act like the Skynet terminators.

  • Dan Kempin

    Joe, #5,

    I’m trying to guess if you looked that up or were able to recite it from memory.

  • Bryan Lindemood

    I decided on my own long ago that I didn’t like drones. I like them less now. People do very sad things with great ideas.

  • Dr. Luther in the 21st Century

    What ever happened to Asimov’s three laws?

  • Joe

    Dan – I looked it up to verify that I had it right and found out that my memory was close but not completely correct. I could have sworn Skynet was mentioned by name in that quote.

  • Tom Hering

    An even scarier possibility is robotic systems calling us at home during political campaigns. Thank God it’s unlikely.

  • http://www.biblegateway.com/versions/Contemporary-English-Version-CEV-Bible/ sg

    Interesting book on the topic:

    Wired for War

    http://wiredforwar.pwsinger.com/

    also, the author is currently taking a poll on the question “Will the robotics revolution make war easier or harder to start?”

  • –helen

    Tom Hering @12
    An even scarier possibility is robotic systems calling us at home during political campaigns. Thank God it’s unlikely.

    Right! 8-D
    Last night I got a call from Ed Meese, introducing himself by referring to his job with Reagan.
    IF he’s still alive (I haven’t looked it up yet), do you think he’s calling me personally?
    [Neither do the other politicians; I’m removing that phone.]

  • SKPeterson

    Helen @14 – Who said Ed Meese might not have called you, or at least his cyborg remnant? For all we know, it is cyborgian Ed Meese who is controlling the drones. What better way to combat cyber-porn than to cyberize the man who sought to end pornography when he was still fully living human flesh? And, face it, if there’s a zombie apocalypse don’t you want to have a large number of back-up cyborg lives and drones to use in wiping out the zombie hordes?

  • http://enterthevein.wordpress.com J. Dean

    This opens a dangerous can of worms. Machines can malfunction, and I don’t want to be around one of these things when that happens. I don’t care how many failsafes may be included: things go wrong with machines, and it’s not pretty when it happens.

  • HistoryProfBrad

    Reminds me of the film from the early ’70s called “Colossus: The Forbin Project.” I guess we have forgotten the notion that just because you CAN do something does not mean that you SHOULD do something.

  • Jon

    Why all the angst?
    How about our right to keep and bear drones, or is Obama going to take that away too?! Drones don’t kill people; people kill people. I’ll give you my drone when you pry its launch mechanism from my cold, dead hand. If drones are outlawed, only outlaws will have drones. Etc.

  • http://steadfastlutherans.org/ SAL

    This is a Christmas song but it seems appropriate.

    http://www.youtube.com/watch?v=B3DyxaCYlfg

    I suspect that eventually most pilots will be retired as drones become more maneuverable than anything a human could survive flying.

    However, the Soldiers and Marines will be necessary on the ground in any sort of war that features attempts to hold territory or establish order. That means soldiers and marines will be fighting and dying in the flesh long after the Navy and Air Force have been handed over to drones.

  • http://somewebsite.somedomain.com C-Christian Soldier

    Before the new ROEs (Rules of Engagement), I would have had second thoughts about the new drones.
    Now I do not: let them identify and destroy our enemies… w/o PC involvement from cloistered folks telling our live BEST what to do…
    Carol-CS

  • Tom Hering

    “… soldiers and marines will be fighting and dying in the flesh long after the Navy and Air Force have been handed over to drones.”

    Don’t count on it. The Army wants autonomous robot soldiers, and it wants them bad. They’re spending a lot of money on them already. Will they be Terminators? That’s not the plan:

    http://www.telegraph.co.uk/news/worldnews/northamerica/usa/3536943/Pentagon-hires-British-scientist-to-help-build-robot-soldiers-that-wont-commit-war-crimes.html

    Okay, so they probably won’t be Terminators. But will they be lovable? Yes, whether they’re designed to be or not:

    http://www.washingtonpost.com/wp-dyn/content/article/2007/05/05/AR2007050501009.html

    (Note that both articles are a few years old. I guess most of us are a little behind the times.)

  • Cincinnatus

    Eh, we decided our fate when we abandoned our swords for firearms.

  • WebMonk

    (sorry to come in late) This is old hat, to some extent. The army has already been experimenting with automatic gun emplacements for defensive purposes.

    The gun senses enemy targets sneaking up using a variety of sensors (thermal, low-light, radar, lidar, etc.), and the results are analyzed to determine whether they qualify as an enemy. The gun can then aim and fire on its own in a small fraction of the time a human would take.

    There were a lot of concerns about the accuracy of its determination of enemy combatant or not, so they added a human in the loop who watches the sensor results and confirms or denies that it is a valid target.

  • Jonathan

    @14 Helen,

    A quick tODDing reveals that Mr. Meese (a Lutheran, BTW) is still this side of heaven.

  • Jonathan

    @23 The Phalanx (Sea Whiz) type systems you mention are different in that they are defensive and operate in principle no differently than, say, a claymore mine with a trip wire or, for that matter, any other type of antipersonnel land mine.

    The unique thing described here ultimately would be an autonomous system that offensively seeks out and identifies a target and takes it out, following all the LOW, ROE, and campaign objectives and strategy. In other words, a cyber soldier.
