Steven Pearlstein reflects on our recent disasters, all of which took us by surprise and none of which we were prepared for:
In just the past decade, we’ve had the attacks of Sept. 11, the tsunami in the Indian Ocean, Hurricane Katrina, the global financial crisis, a global flu pandemic, the earthquake in Haiti, the oil spill in the Gulf of Mexico, and devastating floods in Australia and New Zealand. Now, Japan has been hit with a triple whammy of earthquake, tsunami and nuclear crisis.
What all of these have in common is that they are all low-probability, high-impact events — the “long-tail” phenomenon, to use the jargon of risk modelers, referring to the far ends of the traditional bell curve of probabilities, or “black swans,” to use the metaphor popularized by former Wall Street trader Nassim Nicholas Taleb.
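To make the "long tail" concrete, here is a small illustrative sketch (the numbers and the choice of distributions are mine, not from the column): a thin-tailed bell curve almost never produces a draw far from its average, while a heavy-tailed distribution routinely produces rare, enormous extremes.

```python
import random
import statistics

# Illustrative only: compare the most extreme value seen in many draws
# from a thin-tailed (normal) distribution versus a heavy-tailed
# (Pareto) one -- a standard stand-in for "long-tail" risk.
random.seed(42)
N = 100_000

# Thin-tailed: standard normal draws.
normal_draws = [random.gauss(0, 1) for _ in range(N)]

# Heavy-tailed: Pareto draws with shape alpha = 1.5.
pareto_draws = [random.paretovariate(1.5) for _ in range(N)]

print(f"normal: mean={statistics.mean(normal_draws):.2f}, "
      f"max={max(normal_draws):.1f}")
print(f"pareto: mean={statistics.mean(pareto_draws):.2f}, "
      f"max={max(pareto_draws):.1f}")
```

With the normal draws, the largest of 100,000 samples sits only a few standard deviations from the mean; with the Pareto draws, a single outlier can dwarf everything else. That asymmetry is why a risk model built on the "traditional bell curve" keeps being surprised.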
Such calamitous events have been a regular part of the human experience since Noah and the flood, some of them natural, others manmade. In spite of that, however, we continue to underestimate their frequency and severity.
To a degree, that is a good thing. If we were to focus too much of our attention on all the really, really bad things that could befall us, we’d never get out of bed in the morning.
But the same psychological trait that allows us to go about our daily business also creates blind spots. Although we observe that calamities happen, we assume that they won’t happen to us, or they won’t happen again. And if it has been a long time since the calamity, we are apt to take false comfort that we have beaten the odds. . . .
Part of the problem is that we don’t know what we don’t know. The other part is that small miscalculations of probabilities can have large effects on outcomes when dealing with long periods of time. Think of the sailor who sets off on a voyage a few degrees off course. A few miles out, the error is small, but by the time he crosses the ocean, he may find himself hundreds of miles from the intended destination.
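The sailor's arithmetic is simple to check. In this sketch (the 5-degree error and 3,000-mile crossing are my illustrative figures, not Pearlstein's), the lateral drift grows in proportion to the distance traveled:

```python
import math

def off_course_miles(distance_miles: float, error_degrees: float) -> float:
    """Lateral displacement from the intended track after traveling
    distance_miles with a constant heading error of error_degrees."""
    return distance_miles * math.sin(math.radians(error_degrees))

# A few miles out, a 5-degree error is barely noticeable...
near = off_course_miles(10, 5)
# ...but across a 3,000-mile ocean it compounds into hundreds of miles.
far = off_course_miles(3000, 5)
print(f"after 10 miles:    {near:.1f} miles off course")
print(f"after 3,000 miles: {far:.0f} miles off course")
```

The same linear compounding is what makes a small misestimate of a yearly disaster probability so costly: a tiny error, applied over decades, adds up to a badly wrong picture of the risk.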
Our reward structures don’t encourage spending the time or the money to deal with low-probability disasters. The chief executive of Citigroup acknowledged as much when he told a reporter in 2007 that he would lose his job if he gave up profit and market share to shield his bank from the obviously excessive risk-taking that everyone knew was going on. And you can only imagine the outcry from the industry and Gulf Coast politicians if government regulators back in 2009 had ordered oil companies to spend millions of dollars to have enough boats and booms at the ready to deal with a BP-sized oil spill from deepwater drilling.

Indeed, it seems that when we conclude that the chance of something really bad happening is very small, we wind up taking actions that increase either the probability of the disaster or the damage it will cause. Once the rocket scientists on Wall Street, for example, concluded that it was virtually impossible for investors in so-called “mezzanine” tranches of mortgage-backed securities to lose money, it set off a chain of events that made the prediction untrue. The heavy demand for the securities led to dramatically lower lending standards and a sharp increase in housing prices, creating a bubble so large that when it burst, it caused heavy losses for those same mezzanine investors. The declaration that a particular investment was riskless became a self-negating prophecy.
Similarly, when the government builds a levee, it may reduce the frequency of damaging floods but may also encourage even more people to build homes and businesses behind the barrier. When the Big One finally arrives, the total damage will be even greater than if no levee had been built.
We’re also discovering that the impact of disasters is magnified by globalization. The troubles in northern Japan, for example, are beginning to ripple through global supply chains, creating bottlenecks and shortages in dozens of industries. The way globalization increases economic efficiency is by leveraging the advantages of scale and specialization. Yet the bigger and more concentrated production becomes, the more vulnerable it becomes to disruption.
Many scholars now think that the very complexity of modern life — including our transportation and communication systems, our economy and our social interactions — is directly implicated in the severity of catastrophes. In more complex systems, even small changes or perturbations can have disproportionate and unpredictable effects. The things that make our systems more efficient also make them more effective in spreading the impact of a catastrophe.