Taken from the description of Petrov Day on LessWrong:
I find it interesting to think about Petrov Day in the context of another LessWrong post, “Ethical Injunctions” (which I noodled over in “Not Even if it’s the Right Thing to Do“). The deterrence system wasn’t built with Petrov’s qualms as a failsafe, and if those qualms had been known ahead of time, they might have destabilized mutually assured destruction.
The story begins on September 1st, 1983, when Soviet jet interceptors shot down a Korean Air Lines civilian airliner after the aircraft crossed into Soviet airspace and then, for reasons still unknown, failed to respond to radio hails. 269 passengers and crew died, including US Congressman Lawrence McDonald. Ronald Reagan called it “barbarism”, “inhuman brutality”, “a crime against humanity that must never be forgotten”. Note that this was already a very, very poor time for US/USSR relations. Andropov, the ailing Soviet leader, was half-convinced the US was planning a first strike. The KGB sent a flash message to its operatives warning them to prepare for possible nuclear war.
On September 26th, 1983, Lieutenant Colonel Stanislav Yevgrafovich Petrov was the officer on duty when the warning system reported a US missile launch. Petrov kept calm, suspecting a computer error.
Then the system reported another US missile launch.
And another, and another, and another.
What had actually happened, investigators later determined, was sunlight reflecting off high-altitude clouds, in a rare alignment with the satellite’s view of a US missile base.
In the command post there were beeping signals, flashing lights, and officers screaming at people to remain calm. According to several accounts I’ve read, there was a large flashing screen from the automated computer system saying simply “START” (presumably in Russian). Afterward, when investigators asked Petrov why he hadn’t written everything down in the logbook, Petrov replied, “Because I had a phone in one hand and the intercom in the other, and I don’t have a third hand.”
The policy of the Soviet Union called for launch on warning. The Soviet Union’s land radar could not detect missiles over the horizon, and waiting for positive identification would limit the response time to minutes. Petrov’s report would be relayed to his military superiors, who would decide whether to start a nuclear war.
Petrov decided that, all else being equal, he would prefer not to destroy the world. He sent messages declaring the launch detection a false alarm, based solely on his personal belief that the US did not seem likely to start an attack using only five missiles.
So why are Petrov’s system-bucking actions admirable? Is it just because he happened to be right? That’s not a very trustworthy heuristic: almost everyone who thinks they’re an exception to the rule also thinks their judgement is sound.
In Petrov’s case, the institution he was defying was about to be wiped out if the warning were real. His move was analogous to the last round of an iterated Prisoner’s Dilemma: a choice made in the final round can’t inform anyone’s choices in future rounds.
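The game-theoretic point can be made concrete. In an iterated Prisoner’s Dilemma against a retaliating opponent, a mid-game defection gets punished in later rounds, but a last-round defection has no later round in which to be punished. A minimal Python sketch (the payoff numbers and the tit-for-tat opponent are my illustrative assumptions, not part of the original post):

```python
# Iterated Prisoner's Dilemma against tit-for-tat.
# Payoffs from our perspective (illustrative standard values):
# (our move, their move) -> our payoff
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def total_payoff(rounds, defect_round=None):
    """Play `rounds` rounds against tit-for-tat, cooperating in every
    round except `defect_round` (0-indexed); return our cumulative payoff."""
    opponent_next = "C"  # tit-for-tat opens with cooperation
    total = 0
    for t in range(rounds):
        ours = "D" if t == defect_round else "C"
        total += PAYOFF[(ours, opponent_next)]
        opponent_next = ours  # tit-for-tat copies our previous move
    return total

baseline = total_payoff(10)               # always cooperate
early = total_payoff(10, defect_round=4)  # mid-game defection: punished next round
last = total_payoff(10, defect_round=9)   # last-round defection: no round left to punish in
print(baseline, early, last)  # 30 29 32
```

A mid-game defection ends up worse than steady cooperation, while the same defection in the final round strictly pays, because the “shadow of the future” that normally enforces the rule has vanished. That is the sense in which Petrov’s defiance, with the institution itself on the line, couldn’t undermine future rounds of the game.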
Contrast this with the ethical-injunction-defying behavior of the CIA, which carried out sham vaccinations in order to get DNA samples and locate bin Laden’s family in Pakistan. By exploiting the trust we place in doctors, even just this once, even intending not to get caught, the CIA broke the system, and now 160,000 children are not being vaccinated for polio.
So, if you think you’re an exception to the rule that nearly everyone who thinks they’re an exception to the rule is wrong, you’re probably wrong. And you might want to apply the back-up check that the CIA would have failed and Petrov passed: is your action going to collapse the institution or tradition that you’re sneaking around? If so, don’t.