Command and Control
The Story of Nuclear Weapons and the Illusion of Safety

Eric Schlosser, 2014

Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world

It took me a long time to get around to reading this book. Over and over I'd pick it up in bookstores, only to sigh and place it back on the shelf. A history of nuclear weapons seemed interesting to be sure, but it was too distant, too academic, too irrelevant to the present day. Perhaps I'd read one of Schlosser's excellent New Yorker features on the subject, but 500 pages was a touch extravagant.

Once I finally gave in, the depth of my ignorance became radically, fascinatingly, terrifyingly apparent. Command and Control proves an unnervingly ironic title: Schlosser shines a light on just how little we've had of either, and leaves the reader simply awestruck that we've been so lucky thus far.

Command and Control unfolds in two interwoven strands. The first is a conventional yet compelling history of nuclear weapons. The second is a harrowing account of the 1980 Damascus Incident, when a Titan II intercontinental ballistic missile exploded in its silo under the soil of rural Arkansas, throwing its 9 megaton W53 warhead hundreds of feet into the air, though thankfully without detonating it. Bill Clinton would be much more famous as the governor who lost half of Arkansas if it had.

Throughout the book, Schlosser draws heavily on Charles Perrow's seminal work Normal Accidents. While it may be simple to attribute the explosion at Damascus to a socket dropped from an improperly sized socket wrench, Perrow's study of "trivial events in non-trivial systems" regards the cause as an inevitable result of the architecture of the system. Key to Perrow's risk typology is the idea of tightly coupled systems, where, in a failure mode, system components can interact in new and unforeseen ways that don't permit simple intervention at one point by a human operator. As Perrow states, "No one dreamed that when X failed, Y would also be out of order, and the two failures would interact so as to both start a fire and silence the fire alarm". For compelling examples, it's worth checking out the (surprisingly captivating) YouTube channel of the U.S. Chemical Safety Board, which features detailed reconstructions of major industrial accidents, and the complexity that underlies them.

While this concept underlies the entire book, Schlosser fails to note it explicitly until the epilogue. This makes Command and Control one of the rare books where I suggest reading the epilogue first, as the rest of the book is far more engaging when equipped with the appropriate framework for understanding the events taking place. Schlosser quotes Scott Sagan, professor of political science at Stanford and special assistant to the Joint Chiefs of Staff on strategic nuclear policy, that "Nuclear weapons may well have made deliberate war less likely, but the complex and tightly coupled nuclear arsenal we have constructed has simultaneously made accidental war more likely", a disquieting thought indeed.

When thinking of nuclear weapons, perhaps the most obvious instance of a complex system is the weapon itself. Schlosser shows his groundwork throughout the book, but nowhere more than when discussing the engineering details that less insightful authors would have readily abstracted away. Armed with a battery of interviews with senior engineers from Los Alamos and Sandia National Laboratories, Schlosser gives not only the interesting details of exactly what lies within the belly of the bomb but, more importantly, an idea of how those components can fail. Without a doubt, nuclear weapons - and US nuclear weapons in particular - are some of the most carefully engineered devices on the planet. Firewalls are placed, safety switches are set, key components of the firing system are physically separated to prevent them from interacting. But the very nature of accidents is that they occur outside, often well outside, the remit of normal operation. Physical damage can smash together two circuit boards that were supposed to be air-gapped, crossing wires that were never supposed to meet. In extreme cases, one failure can defeat multiple safety systems. A fire sustained by jet fuel can not only melt the safety switches of a bomb, it can then discharge the capacitors, simulating a firing signal.

Above all, the lesson of normal accident theory is that complex systems rarely permit simple solutions. Knowing the risks of fire, the US Air Force adopted a policy of removing the locking pins from bombs during takeoff and landing, permitting them to be jettisoned away from the plane's burning wreckage during the most dangerous stages of flight. However, when Fortuna closes a door, Eris all too often opens a window, and the policy has resulted in the accidental bombing (with the nuclear cores removed, thankfully) of the US mainland on two occasions, as crew members clutched at the manual release handle for support during turbulence.

Zooming out a level, Schlosser shows us that engineering is not an island (as much as every engineer I've ever met sorely wishes it was) and that these design choices are part of larger systems, both operational and political. Essential maintenance and upgrading of existing weapons fall by the wayside in a congressional environment where shiny new designs are the path to publicity, and poor maintenance on complex systems introduces all manner of unanticipated failure modes. In parallel, if the value of nuclear weapons lies in their capacity as a deterrent, they must be relied on to launch the moment they're needed, and this mindset accounts for a great deal of military intransigence toward new safety features. In fact, for a full decade after locks were installed on the US ICBM fleet (a radical idea, I know), the Air Force insisted that they all be set to '00000000', in order to avoid the risk that crews would be stuck unable to launch if the need ever came.

In the words of the game theorist Oskar Morgenstern, "The human mind cannot construct something that is infallible... The laws of probability virtually guarantee [a nuclear] accident". As Schlosser notes, nuclear weapons have, thus far, had a 100% safety rate, at least as regards actual core detonation. The sobering companion to this statistic is that, by Schlosser's calculation, hundreds of thousands of people may live in the one-bomb gap between 100% and 99.99857%.
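For what it's worth, the arithmetic behind that figure appears to be simple. The sketch below is my own back-of-the-envelope reading, not Schlosser's working, and it assumes the commonly cited total of roughly 70,000 US warheads built since 1945:

    # Back-of-the-envelope reading of the "one-bomb gap" (my sketch, not Schlosser's).
    # Assumption: roughly 70,000 US warheads built since 1945, a commonly cited total.
    warheads_built = 70_000

    # The safety rate had exactly one of those weapons accidentally detonated.
    one_accident_rate = (warheads_built - 1) / warheads_built

    print(f"{one_accident_rate:.5%}")  # 99.99857% - the figure quoted above

One detonation in seventy thousand, in other words, is the entire margin separating the historical record from catastrophe.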

Every man, woman, and child lives under a nuclear sword of Damocles, hanging by the slenderest of threads, capable of being cut at any moment by accident or miscalculation or by madness.

Whilst Schlosser makes chillingly clear the impact of a failure in the electromechanical systems of nuclear destruction, he then notes that these risks pale in comparison to those inherent in their political systems. As the bomb became the centrepiece of cost-efficient military deterrence on both sides of the Cold War, the risk of extinction, accidental or deliberate, became a very real one for our species. True to the book's title, Schlosser talks in depth about the nuclear command systems of the United States, and the terrifying definiteness they possessed. In a recent article in POLITICO, Princeton researcher Bruce Blair notes the power of the presidency in times of nuclear crisis, going so far as to describe the office as that of a "nuclear monarch". To be sure, this is more a function of ballistics than of politics - the missile flight time between Russia and New York affords little time for sober democracy - but the result is the same. Of particular concern is the US blueprint for armageddon, the Single Integrated Operational Plan (or SIOP). A plan so brutal that even Henry Kissinger, the quintessential realpolitik apparatchik, referred to it as a "horror strategy". In short, the SIOP provided the president with two options: launch nothing, or launch everything. Until a redraft under the Reagan administration, at the conclusion of the Cold War, limited strikes were not an option. At the command of the president, thousands of warheads would destroy every military base and civilian city in the USSR.

The logic behind the SIOP, popularly known as Mutually Assured Destruction, has taken several names through its history. Massive retaliation. The atomic blitz. Peace through strength. The Samson option. It's likely thanks to this that we've managed to last nearly seventy years in the atomic bipolar age without an attempt at a first strike. The political scientist Elspeth Rostow even suggested that the bomb be awarded the Nobel Peace Prize. Callous though it may seem, the game theoretic foundations are as solid as the pillars of a Philistine temple, but with similarly tragic consequences when toppled. What Command and Control reveals is that this safety from intentional destruction may have been purchased at the cost of enormous risk of accidental annihilation, the inevitable consequence of normal accidents in vastly dangerous systems.

We have...made accidental war more likely

If a ballistic missile is a good example of a tightly coupled system, the holistic political-military-technological nuclear system is a brilliant one. When a faulty radar in Greenland mistakes the moon for a salvo of Soviet missiles or a broken chip confuses a test message for a launch warning, only minutes exist to determine whether the attack is real or not. Until the Cuban Missile Crisis, no hotline existed between Washington and Moscow, no potential to de-escalate a crisis. In order to defend against electronic hijacking, missiles cannot be diverted or deactivated once launched. Once the ball starts to roll, it cannot be stopped.

Like any wicked problem, the situation resists even the cleverest attempts to nullify it. Ronald Reagan's Star Wars program (the Strategic Defense Initiative, if you insist) was intended to defend against an attack, to give the United States more time to come to a sober decision when it mattered most. In reality, the program would never have been able to deal with more than a few dozen missiles, to defend against what remained following an American first strike, and would likely have actually increased the risk of war, with Soviet General Secretary Yuri Andropov going so far as to publicly call the plan 'insane' (ironic, given the fact that the Soviet Perimeter system operated through Andropov's entire term). For nations with limited stockpiles, credible deterrence relies on a broad distribution of weapons, impossible to wipe out in a single stroke, but more likely to be stolen by terrorists or insurgents. When belligerents are in proximity, the exact locations of nuclear sites become vital in purchasing the time needed for a credible response. Anyone moderately familiar with the geography of Pakistan will know what parts of the country are furthest from the Indian border, and understand the risks of decentralised nuclear sites there. The rest of you will likely sleep easier if you don't figure it out.

Which brings us to the question of just how we can manage these dangers. In the famous essay Do Artifacts Have Politics?, Langdon Winner wrote that nuclear weapons could never be managed with a democratic spirit, that "the internal social system of the bomb must be authoritarian; there is no other way". Nuclear weapons and everything that surrounds them are held in the tightest possible control by the governments that possess them. In the United States, nuclear secrets are "born secret", classified automatically by legislation, rather than by conventional review. In the world of cryptography, this is an idea known as "security by obscurity". The reputation of that idea in computer security circles goes a long way in explaining some of the more baffling items in Schlosser's catalogue of nuclear errors, and yet nobody in their right mind would suggest turning nuclear secrets, let alone control, over to the general populace.

Thus the question of nuclear control remains, and becomes increasingly pressing. If more democracy is not the answer, perhaps the answer lies in the opposite direction. In 1946, 54% of Americans agreed that the United Nations should have sole control over nuclear weapons, including those of the US. A manifesto for international control, One World or None, became an international bestseller. Even the ardent pacifist Bertrand Russell argued that the UN should rain nuclear fire on the Soviet Union, and any other state which sought to develop weapons of its own.

That moment has passed now, and the multipolar order has taken over, sustained first by our fear, later our apathy. The question of how to manage it remains, but no good solutions exist. Command and Control is a first-rate work of history and a damn fine guide to complex risk, but its lasting value lies in Schlosser's chilling reminder that these weapons are more than simply curiosities of a different era. Command and Control may not be a book that interests everyone, but the world would be a safer, more sober place if it were.


The cover image is taken from U.S. military footage of the Desert Rock IV exercise, conducted in parallel with the Dog test detonation of Operation Tumbler-Snapper. It shows U.S. Marines ordered to walk towards a 19 kiloton detonation, in order to test the effect that nuclear weapons would have on the psychological willingness of enemy soldiers to fight. Many of these marines suffered severe radiation poisoning, as detailed in a brilliant piece by the Center for Investigative Reporting. Much like us, they incorrectly believed that nuclear weapons were safe.