How to dodge a drone

Al-Qaeda is circulating a list of 22 techniques to thwart drone attacks (ones experts say are effective), showing how terrorists keep adapting to efforts to counter them. I give the list after the jump.

And now “zombie guns”

Russia has developed weapons that disable the central nervous system to varying degrees.  They have already been used for “crowd control,” suggesting that they are intended not just for military purposes but for internal policing and even political control.  But what I want to know as an American is whether the Second Amendment applies to zombie guns too!

Vladimir Putin has confirmed Russia has been testing mind-bending psychotronic guns that can effectively turn people into zombies.

The futuristic weapons – which attack their victims’ central nervous system – are being developed by scientists and could be used against Russia’s enemies and even its own dissidents by the end of the decade.

Mr Putin has described the guns, which use electromagnetic radiation like that found in microwave ovens, as entirely new instruments for achieving political and strategic goals.

Plans to introduce the super-weapons were announced by Russian defence minister Anatoly Serdyukov.

While the technology has been around for some time, Mr Tsyganok said the guns were recently tested for crowd control purposes.

“When it was used for dispersing a crowd and it was focused on a man, his body temperature went up immediately as if he was thrown into a hot frying pan,” Mr Tsyganok said.

“Still, we know very little about this weapon and even special forces guys can hardly cope with it,” he said.

Research into electromagnetic weapons has been carried out in the US and Russia since the ’50s but it appears Putin has stolen a march on the US.

Precise details have not been revealed but previous research has shown that low-frequency waves or beams can affect brain cells, alter psychological states and make it possible to transmit suggestions and commands directly into someone’s thoughts.

Mr Putin said the technology is comparable in effect to nuclear weapons but “more acceptable in terms of political and military ideology”.

via Russia working on electromagnetic radiation guns | Space, Military and Medicine | Herald Sun.

Predator drones for bad guys

A Predator drone killed Anwar al-Awlaki, an al-Qaeda leader, propagandist, and recruiter.  Complicating the matter is that he and one of his minions, also killed in the Yemen attack, were American citizens.  Some are concerned that executing an American like this violates our constitutional rights to due process and a fair trial.  Others say that al-Awlaki was a textbook example of a traitor, one who fought on the side of his country’s enemies, and that being killed in a quasi-military operation is simply what happens in war; it has nothing to do with the legal system.

Some of you have already been arguing about this in another post (rather than staying on topic), but let’s take this in another direction.  Do you think Predators should be used in law enforcement?

In which of these cases would you support their use?

(1)  Against the Mexican drug lords who are terrorizing Mexico (in consultation with that country’s authorities)?

(2)  Against domestic organized crime leaders?

(3)  In situations that call for deadly force, such as against snipers or holed-up killers, as another weapon in the arsenal of SWAT teams?

(4)  To patrol dangerous neighborhoods?

(5)  As a high-tech cop on the beat, used mostly for surveillance but carrying a weapon?

(6)  For surveillance only, without the Hellfire missiles?

How would you handle the constitutional issues?  Is this just another weapon or tool, or are there particular legal or moral problems with it?

Help us draw some lines.

via Anwar al-Awlaki: Is killing US-born terror suspects legal? – CSMonitor.com.

Drones that kill on their own

On the horizon of military technology:  Drones that “decide” on their own whom to kill:

One afternoon last fall at Fort Benning, Ga., two model-size planes took off, climbed to 800 and 1,000 feet, and began criss-crossing the military base in search of an orange, green and blue tarp.

The automated, unpiloted planes worked on their own, with no human guidance, no hand on any control.

After 20 minutes, one of the aircraft, carrying a computer that processed images from an onboard camera, zeroed in on the tarp and contacted the second plane, which flew nearby and used its own sensors to examine the colorful object. Then one of the aircraft signaled to an unmanned car on the ground so it could take a final, close-up look.

Target confirmed.

This successful exercise in autonomous robotics could presage the future of the American way of war: a day when drones hunt, identify and kill the enemy based on calculations made by software, not decisions made by humans. Imagine aerial “Terminators,” minus beefcake and time travel.

The Fort Benning tarp “is a rather simple target, but think of it as a surrogate,” said Charles E. Pippin, a scientist at the Georgia Tech Research Institute, which developed the software to run the demonstration. “You can imagine real-time scenarios where you have 10 of these things up in the air and something is happening on the ground and you don’t have time for a human to say, ‘I need you to do these tasks.’ It needs to happen faster than that.”

The demonstration laid the groundwork for scientific advances that would allow drones to search for a human target and then make an identification based on facial-recognition or other software. Once a match was made, a drone could launch a missile to kill the target. . . .

Research into autonomy, some of it classified, is racing ahead at universities and research centers in the United States, and that effort is beginning to be replicated in other countries, particularly China.

“Lethal autonomy is inevitable,” said Ronald C. Arkin, the author of “Governing Lethal Behavior in Autonomous Robots,” a study that was funded by the Army Research Office.

Arkin believes it is possible to build ethical military drones and robots, capable of using deadly force while programmed to adhere to international humanitarian law and the rules of engagement. He said software can be created that would lead machines to return fire with proportionality, minimize collateral damage, recognize surrender, and, in the case of uncertainty, maneuver to reassess or wait for a human assessment.

In other words, rules as understood by humans can be converted into algorithms followed by machines for all kinds of actions on the battlefield.

via A future for drones: automated killing – The Washington Post.
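
For readers curious about the nuts and bolts: the “zeroing in” described above is, at bottom, ordinary computer vision.  Here is a minimal sketch, assuming Python and the OpenCV library, of how an onboard camera feed might be scanned for a brightly colored tarp.  The color thresholds and the function name are my own hypothetical choices, not details of the Georgia Tech software.

```python
# Illustrative sketch only: color-blob detection of the sort the Fort Benning
# demonstration might have used. All thresholds and names are hypothetical.
import cv2
import numpy as np

# Rough HSV range for the tarp's orange panel (hypothetical values).
ORANGE_LO = np.array([5, 120, 120])
ORANGE_HI = np.array([20, 255, 255])

def find_tarp(frame):
    """Return the (x, y) pixel center of a candidate tarp, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, ORANGE_LO, ORANGE_HI)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    biggest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(biggest) < 500:   # too small: likely noise, not a tarp
        return None
    m = cv2.moments(biggest)
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))
```

The demonstration’s second step, having another aircraft confirm the find with its own sensors, amounts to running an independent detector before declaring “target confirmed,” a simple guard against one camera’s false positive.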

The article alludes to “ethical” and “legal” issues that need to be worked out with this particular technology.  Like what?  Is automating war to this extent a good idea?  Does this remove human responsibility and guilt for taking a particular human life?  Might this kind of technology develop, eventually, into a military without actual human beings in overt combat?  How could this be abused?
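
Arkin’s final point, that humanly stated rules can be “converted into algorithms,” is at least easy to illustrate.  Below is a minimal sketch in Python; the Track fields, the confidence threshold, and the decision categories are all hypothetical inventions of mine, not anything from his study or any real system.  The point is only that such rules can take the form of explicit, inspectable code.

```python
# Hypothetical sketch of "rules of engagement as an algorithm."
# Nothing here reflects a real system; it only illustrates Arkin's claim that
# humanly stated rules can be rendered as explicit, machine-checkable logic.
from dataclasses import dataclass
from enum import Enum

class Decision(Enum):
    HOLD_FOR_HUMAN = "wait for human assessment"
    DO_NOT_ENGAGE = "do not engage"
    ENGAGE = "engage"

@dataclass
class Track:
    hostile_confidence: float   # 0.0-1.0, from sensors and classifiers
    has_surrendered: bool       # e.g., a recognized surrender gesture
    civilians_nearby: int       # estimated count within the blast radius
    incoming_fire: bool         # is the platform currently being fired upon?

def rules_of_engagement(t: Track) -> Decision:
    # Rule 1: a recognized surrender always ends the engagement.
    if t.has_surrendered:
        return Decision.DO_NOT_ENGAGE
    # Rule 2: under uncertainty, defer to a human (Arkin's "wait" case).
    if t.hostile_confidence < 0.95:
        return Decision.HOLD_FOR_HUMAN
    # Rule 3: minimize collateral damage; any civilians nearby means hold.
    if t.civilians_nearby > 0:
        return Decision.HOLD_FOR_HUMAN
    # Rule 4: proportionality; here, return fire only if fired upon.
    if t.incoming_fire:
        return Decision.ENGAGE
    return Decision.HOLD_FOR_HUMAN
```

Whether a hard-coded threshold like 0.95 can bear the moral weight the questions above place on a human conscience is, of course, exactly what is in dispute.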

