In "Unorthodox Thoughts about Asymmetric Warfare," published in the current issue of Parameters, the US Army War College quarterly, Montgomery Meigs writes this eerily prescient passage:
Technology plays a critical role in this new equation. Strategically, from financial markets to transportation systems to electric power grids, standards of living worldwide depend fundamentally on integrated technical systems that are susceptible to idiosyncratic threats. The operational structures upon which campaigns depend have similar attributes. These systems may have internal safeguards against failure in normal operations, but they do not have an ability to avoid catastrophic failure when they are interrupted or attacked in an unexpected, unanticipated, and peculiar way that generates cascading or accelerating effects.

The Northeast blackout of 9 November 1965 provides a useful example. At 5:16 p.m. on that day, an overcurrent relay on a transmission line from the Beck power plant outside of Toronto tripped and shut down one of six lines carrying power from that plant into the Canadian power grid that served Ontario. In 2.5 seconds -- to protect Beck's generators from overload -- shutdowns rippled through the Canadian system, closing off the five other lines from the plant. The transmission systems in Ontario were linked to systems in New York. When the demand from Ontario went off-line, Beck's output surged into the power grid in New York, almost doubling throughput. The overload began to surge through the US grid, threatening generation plants all over the Northeast. To protect their own generators, private utilities took their systems off-line, forcing the large public utilities to follow suit. In a total of four seconds, the Northeast went completely dark. The blackout represents the potential for catastrophic failure of technologically intensive systems with high degrees of interdependence.

If one can find a weakness through which safety factors can be overloaded or bypassed, then manipulate the system in a self-destructive, eccentric manner, he can cause imploding, catastrophic failure.

The principle also applies in military operations. If one can attack the center of gravity of an operational system in an idiosyncratic manner with weapons or a combination of weapon systems that the opponent does not possess -- or, even better, does not even understand or perceive -- then the perpetrator can achieve catastrophic failure of that system, whether the target is a transportation network or an integrated command and control grid. The potential effect increases to the degree that the system is technologically intensive and functionally or geographically integrated.

And now we have a second example.

Posted by Vanderleun at August 22, 2003 9:05 AM
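The cascade mechanism Meigs describes -- one relay trips, the surviving lines inherit the load, and each new overload trips the next relay -- can be sketched with a toy model. Everything below is illustrative: the line capacities, the total load, and the even-redistribution rule are assumptions for the sketch, not the actual 1965 grid.

```python
# Toy cascade model: several parallel lines share a fixed total load.
# When a line's share exceeds its capacity, its relay trips it, and the
# load is redistributed evenly over the survivors. Numbers are made up.

def cascade(total_load, capacities):
    """Return (sequence of tripped line ids, final state)."""
    live = dict(enumerate(capacities))  # line id -> capacity
    trips = []
    while live:
        share = total_load / len(live)          # even redistribution
        overloaded = [i for i, cap in live.items() if share > cap]
        if not overloaded:
            return trips, "stable"
        for i in overloaded:                    # relays trip overloaded lines
            del live[i]
            trips.append(i)
    return trips, "blackout"                    # no lines left: total failure

# Six lines of mixed ratings. At 900 units of load each line's share is
# within its rating, so nothing trips. At 1000 units the weakest line
# trips first, and each trip overloads the next-weakest survivor.
print(cascade(900, [250, 220, 200, 200, 180, 150]))   # stable
print(cascade(1000, [250, 220, 200, 200, 180, 150]))  # full cascade
```

The instructive part is the loop: no single line is anywhere near carrying the whole load, yet the system fails totally, because each protective action -- locally sensible -- increases the stress on what remains. That is the "cascading or accelerating effects" pattern in miniature.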
You are so right. The analysis that followed shows that, for the most part, there was either no procedure, or very little, in place to cover the worst-case scenario that actually happened.
When I worked in the oil industry, we constantly thought outside the box about what could improbably happen, and drilled on it. Usually, somewhere 'down the pipe,' that improbability would occur. Testing was redundant to the point of deadly boredom, but it saved our butts many a time.