Why South Korea is leading the way in building killer robots

When I moved this blog to Patheos, I was asked to pick a tagline, and I ended up going with the first thing that came to mind: “Philosophy, atheism, killer robots.” (Roll your cursor over the browser tab and you should be able to see it.) This was not a throwaway joke: I really do mean to use this blog to write about killer robots.

To that end, I want to talk about the fact that South Korea is leading the way in building killer robots. Specifically, it’s leading the way in building fully autonomous military robots. The phrase “fully autonomous” is key here. You hear a lot about the use of drone warfare by the US, but current US drone warfare keeps a human in the loop: a human needs to pull the trigger in order for the drone to kill someone. Recently, the Pentagon has tried to insist that humans will always be “in the loop.”

The South Korean military, on the other hand, has already deployed autonomous sentry guns that can be set to identify and fire on targets fully autonomously (though allegedly this setting isn’t being used). Why is South Korea leading the way here? There’s a good reason for it, and it has nothing to do with stereotypes about Asian technophilia.

To understand what’s going on here, it’s worth mentioning a little fact I learned while in Cambodia: Cambodia has the highest number of amputees per capita in the world. That’s because during Cambodia’s civil war in the 1970s, lots of landmines got planted, and after the shooting stopped, the landmines didn’t magically disappear.

To give you another idea of how bad the problem is, if you visit Cambodia as a tourist, the guidebooks will tell you that once you get out of the city centers, you shouldn’t stray from well-trodden paths. That’s because there could be landmines, even in tourist-heavy areas. Cambodia is exhibit A for why there have been attempts to ban landmines internationally.

Except the United States is not totally on board with the idea of banning landmines. That’s because we’ve mined the fuck out of the Korean Demilitarized Zone (or DMZ for short), the narrow strip of land that divides North and South Korea. And arguably, this is morally justifiable, because everyone in Korea knows about the DMZ and knows that if you randomly wander into the DMZ, you not only risk stepping on a landmine but also risk getting shot. So the chances of some kid wandering into the DMZ and stepping on a mine are low.

(As an aside, the Korean DMZ is one of the most hilariously misnamed places on the planet. In spite of the name, both sides of the DMZ are actually heavily fortified.)

And that’s why South Korea can lead the way on killer robots. A really dumb killer robot (at the level we actually know how to build right now) that just shot anything that moved would be hugely problematic, for the same reasons that landmines are hugely problematic. But maybe something like that could be okay for a place like the DMZ (not that the Korean robot is even quite that indiscriminate: supposedly it can tell when a soldier is surrendering, though I can’t confirm how well that function works).

Yet this development still seems worrisome. Grant for the sake of argument that it would be okay to deploy a not-very-discriminating robot to the DMZ. We should still be worried that the military might be tempted to extend that logic to cases where it really doesn’t apply. Like, I don’t know, we could establish a no-fly zone over Iran and have relatively dumb autonomous drones police it, on the theory that anyone who violates the no-fly zone should have known better, but then we don’t publicize it well enough and oh crap, one of our drones just blew up a plane with a bunch of kids on it.

One of the things that convinces me these worries need to be taken seriously is the recently released book Kill Anything That Moves: The Real American War in Vietnam (which I plan to devote a post to in the near future). Among the borderline-genocidal policies described in the book is the use of “free-fire zones”: areas where, in theory, everyone knew you weren’t supposed to go, and therefore anyone found there could be assumed to be Viet Cong and shot, but which an awful lot of civilians failed to stay out of for various reasons, and so got killed.

The lesson of the free-fire zones in Vietnam is simple: a policy that looks good on paper (“we don’t have to worry about civilian casualties, because there will be no civilians in the free-fire zones”) can be murderous in practice. Something similar could happen with our killer robot policy. For that reason, I suspect the best thing to do may be to ban fully autonomous killer robots entirely (“man in the loop” drones are another matter) until we can build robots with much better systems for distinguishing legitimate from illegitimate targets.

Page image from Texts from Drone.

  • Kevin

    I’m actually kind of open to the idea, as long as they test it out via non-lethal means first. Like, they design the system to track an incoming person, but instead of mounting a gun that fires, you mount a fog light to expose the person. You might color-code the light: red means unfriendly (if it were attached to a gun, it would fire), green means friendly, normal means unknown. Ideally, this should be better than a person, since a computer won’t have a self-preservation attitude of shooting first and won’t perceive everything as a weapon. However, there would still be a huge security risk if anyone were to switch the fire command over to the friendly-person setting. A different scenario of Skynet playing out?

  • Ophis

    I’m generally supportive of this kind of technology both autonomous and human-operated, and I think a lot of the opposition to it comes from looking at the potential problems in isolation, without comparing them to the problems of what we have now. For example, if your hypothetical no-fly zone wasn’t enforced by drones, it would be enforced by human pilots, probably operating under the same rules that the drones would be, and the same planeful of kids would still be blown up. They’d just be blown up by a missile from a plane rather than a missile from a drone.

    You also note the problem of distinguishing targets, without noting the often shitty target-identification system that is a human serviceman. The standard for using drones shouldn’t be perfection or near-perfection. It should be performance equal to or better than a human’s, since a human can panic, get paranoid, get sweat in their eyes, get tired and aggressive from working insane hours, get distracted thinking about people at home, or get lax because they’ve got four days till the end of their tour; and they may therefore get themselves or others killed due to problems that robots are immune to.

    • Chris Hallquist

      In theory, I’d definitely agree that AIs have the potential to be more ethical than humans, but only once the technology gets a lot better. The laws of war are all written in terms of concepts that robots are nowhere near capable of comprehending right now. While human implementation of the laws of war leaves a lot to be desired, humans are at least capable of understanding them.

  • http://www.convergence-state.com Luke

    Some say a killer robot COULD act in a more ethical manner (I have my doubts), but keep in mind that’s up to the designers/programmers and the state that employs them. More worryingly, the robot represents a form of asymmetrical warfare, in the sense that an opponent who does not have access to the same technology will suffer human losses, while the destruction of robotic forces merely represents the loss of machinery. The lack of an emotional toll from such a conflict on the robot-armed side (presumably the superior force) would not only lower the bar for engaging in conflict, but also lower the urgency to end such a conflict. It could also lend itself to creating the notion that attacks on civilians within the robotic army’s host country would be the only way to inflict real psychological damage on that country. The global proliferation of drones of all sizes will further enable such a scenario.

