Israel recently killed a top Iranian nuclear scientist. On its own, that might be news, but it wouldn’t be a focus of what I write about here. However, the AI-assisted smart gun using facial recognition raises a whole host of moral issues. Let’s look at the story, then briefly discuss both the privacy issues with facial recognition and the issues of autonomous warfare.
Iranian Scientist Killed
Iran’s top nuclear scientist Mohsen Fakhrizadeh was assassinated by an AI-assisted sniper rifle remotely controlled by Israeli operatives. […]
The gun along with its advanced robotic apparatus weighed roughly a ton, according to the NYT. Israeli operatives smuggled the weapon and its parts piecemeal into Iran before reassembling it.
The entire system was then fitted into the bed of a pickup truck that contained multiple cameras in order to give Israeli operatives a full picture view of the surroundings. The truck was also packed with explosives in order to blow up any evidence after the mission was complete or compromised.
The gun itself was connected to an Israeli command center via a satellite communication relay. There an operative was able to control the gun and take aim at its target via a computer screen.
An AI was developed in order to compensate for Fakhrizadeh’s car movement, and the 1.6 second delay between the camera and what the operator saw. A facial recognition software was also employed to help ensure that only the Iranian scientist would be targeted by the rifle — sparing his wife’s life in the process.
This greater precision, which could spare the wife’s life, is good, but it is not the only concern.
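As an aside for the technically curious: the delay compensation the NYT describes is, at its simplest, a prediction problem. Below is a minimal sketch of constant-velocity dead reckoning, one plausible way to estimate where a moving target will be after a fixed delay. Apart from the 1.6-second figure from the reporting, the function name, coordinates, and numbers are my own illustrative assumptions, not details of the actual system.

```python
# Illustrative only: constant-velocity dead reckoning, the simplest way a
# system could compensate for a fixed communication delay. The actual
# system's method was not disclosed; all names and values are assumed,
# except the 1.6-second delay reported by the NYT.

DELAY_S = 1.6  # delay between the camera feed and what the operator saw

def predict_position(pos, vel, delay=DELAY_S):
    """Estimate where a target will be after `delay` seconds,
    assuming it keeps its current velocity (positions in meters,
    velocities in meters per second)."""
    x, y = pos
    vx, vy = vel
    return (x + vx * delay, y + vy * delay)

# Example: a car at (0, 0) moving 15 m/s along x is about 24 m ahead
# by the time a command issued now takes effect.
print(predict_position((0.0, 0.0), (15.0, 0.0)))  # (24.0, 0.0)
```

Even this toy version shows why the delay matters: at highway speeds, aiming at where the target appears on screen would miss by tens of meters.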
Facial Recognition
I’ve written before about facial recognition software:
It may prevent some crime or catch some criminals. A man decides against shoplifting as he knows he will be caught when the store security camera matches his face against a universal database. Or, a criminal on the run gets caught as he walks by a corner where a police camera uses facial recognition. These are undoubtedly good things. However, are they worth the cost?
In cities, there is already the sense that once you leave your house, every area you are in probably has a security camera recording somewhere. In some parts of cities, police have cameras on almost every corner. However, once these cameras are connected and with facial recognition technology, none of us would be able to go for a cup of coffee with a friend without that becoming part of the public record. If you are caught on camera publicly intoxicated, even if a friend drives you home, you might get a ticket in the mail.
A year ago, Sen. Durbin asked Mark Zuckerberg a few questions about privacy. One was: “Would you be comfortable sharing with us the name of the hotel you stayed in last night?” Obviously, even Zuckerberg was not comfortable sharing this. We all want a certain amount of privacy over what we do, but with the wide use of facial recognition, this seems no longer possible.
Attaching this technology to guns can easily mean death, not just a ticket in the mail for public intoxication. There is no way to appeal a bullet to the head the way one could present an alibi after being cited for public intoxication. This gun still seemed to have a human operator deciding when to fire, which prevents that, but it does create a danger.
Autonomous Warfare
The big danger we seem to be heading towards is autonomous warfare. When we have such precision-guided, remote-controlled weapons, where a human merely certifies the setup and authorizes execution, we are only a step away from it.
The Vatican’s UN observer noted the issues with autonomous killer robots only last month.
The Vatican said Aug. 3 that a potential challenge was “the use of swarms of ‘kamikaze’ mini drones” and other advanced weaponry that utilize artificial intelligence in its targeting and attack modes. LAWS [Lethal Autonomous Weapons Systems], the Vatican said, “raise potential serious implications for peace and stability.” […]
LAWS can “irreversibly alter the nature of warfare, detaching it further from human agency.” […]
“Many organizations and associations and government leaders would agree on the same thing, that life and death decisions cannot be outsourced to a machine. The Holy See has said it as well as anybody,” said Jonathan Frerichs, Pax Christi International’s U.N. representative for disarmament in Geneva.
He told Catholic News Service it is unethical to believe “that we could pretend our future can be served by life and death decisions that can be turned over to an algorithm.”
This is not irrelevant today. The UN Convention on Certain Conventional Weapons, the treaty banning certain non-nuclear weapons, comes up for review in Geneva this December, and LAWS could easily be added to it. We should pray for this meeting and do what we can to support the banning of such weapons.
Conclusion
Weapons that are more precise are beneficial because they reduce collateral death and injury to innocent people. It’s far better if, in warfare, you can blow up an enemy base without also blowing up the school a block away. This has become even more important when seeking out terrorists than when fighting national armies. Facial recognition can help with this, especially for a remote weapon that must follow a target and only receives a signal more than a second after the human approves firing. However, we need to be aware of the privacy issues with facial recognition, the possibility of facial recognition errors, and the significant moral problems with the completely autonomous warfare this seems directed towards. We need to pray that autonomous weapons are banned when the treaty is reviewed this December.
Notes:
- I left aside the issue of whether this was legitimate self-defense or just war on Israel’s part, as that would be a whole other post. It seems likely it would be legitimate self-defense if he was in the process of making a weapon that his bosses had said would be used to kill millions of Israelis, but this could be debated.
- Please support me on Patreon so I can keep writing more analysis like this explaining Catholic moral teaching on current events and especially privacy.