Always / Never

Eric Schlosser, Command and Control

Allen Lane, 656pp, £25.00, ISBN 9781846141485

reviewed by Will Wiles

A Titan II missile silo could kill you in more ways than you might expect. The nervous eye is naturally drawn to what the Titan II carried: a W-53 nuclear warhead with a yield of 9 megatons, ‘about three times the explosive force of all the bombs dropped during the Second World War, including both atomic bombs.’ But try to forget about that for a moment. Consider instead what the warhead sat atop: a two-stage missile containing, in total, about 100,000 pounds of fuel and 200,000 pounds of oxidiser. Mixed together, they explode – the reaction that would lift the missile out of the silo and send it on its journey to its unhappy target.

Even unmixed, these substances were not to be trifled with. The fuel, Aerozine-50, ‘could spontaneously ignite when it came into contact with everyday things like wool, rags, or rust.’ As a vapour in the air, it could explode at low concentrations. Inhaling the vapour or getting the fuel on your skin could cause a quick or lingering death. The oxidiser was, if anything, worse: American federal law classified it as Poison A, ‘the most deadly category of man-made chemicals.’ It too could ignite on contact with leather, paper, cloth or wood. On the skin, it caused severe burns; inhaled, it caused ‘headaches, dizziness, difficulty breathing, pneumonia, and pulmonary edema leading to death.’

And there was still more that could explode or catch fire, including the electro-explosive devices used to free the missile from its supports and separate the rocket stages, and the solid-fuel rocket engines that adjusted its course in flight. Finally, back to the warhead, which was triggered by high-explosive charges. If an accident detonated some of these charges, the conventional explosion could scatter radioactive plutonium over a wide area. If all were triggered at the same time, the result would be a nuclear explosion.

Even the slightest accident around a Titan II could have terrible consequences. On 18 September 1980, two Air Force technicians were conducting a routine procedure on a Titan in silo 374-7 near Damascus, Arkansas – correcting the pressure in the stage 2 oxidiser tank – when the socket dropped off the wrench Airman David Powell was using to loosen a pressure cap, as his partner, Airman Jeffrey L Plumb, looked on.

Plumb watched the nine-pound socket slip through the narrow gap between the platform and the missile, fall about seventy feet, hit the thrust mount, and then ricochet off the Titan II. It seemed to happen in slow motion. A moment later, fuel sprayed from a hole in the missile like water from a garden hose.

“Oh man,” Plumb thought. “This is not good.”

Command and Control, Eric Schlosser's third book, tells what happened next – a tense and complicated technological emergency with the capacity to turn into any number of horrific disasters: a subterranean inferno, a radioactive explosion, an Airborne Toxic Event, or even an unplanned nuclear detonation. But the 1980 ‘Damascus Incident’, in Schlosser's hands, also serves as a frame for a much larger story.

What we have is the history of America's nuclear arsenal, recast as a kind of philosophical epic set amidst the paradoxes and backwards logic that surround The Bomb. During the Second World War, the Manhattan Project poured formidable quantities of treasure and human brilliance into building a device capable of producing an atomic explosion. As soon as the first working atomic bomb was completed, there were fears it might explode at the wrong time. To create a runaway nuclear explosion, a spherical plutonium core had to be placed under extreme pressure, created by the detonation of the precisely shaped high-explosive charges that encased it. Firing all these charges at once was a fiddly matter, necessitating a new device called an ‘X-unit’. But the X-unit could also be fired by static electricity in the atmosphere. The night before the Trinity test, Donald Hornig, designer of the X-unit, spent nervous hours ‘babysitting’ the bomb in a ‘flimsy metal shed’ as a lightning storm raged around the test site.

Among the biggest surprises in a consistently surprising book is the chaotic and ramshackle state of America's ‘nuclear arsenal’ in its early years. Soviet conventional forces in Europe colossally outnumbered those of the USA and its allies. The American response to an invasion of the West would be an ‘atomic blitz’ of 50 bombs dropped on the USSR's main cities, intended to blast the country out of the war at a stroke. But this ‘blitz’ existed only in the minds of the planners: the USAF didn't have the bomber fleet, or the bombs, to carry it out. Strategic Air Command had only 26 flight crews available, and more than half of those weren't expected to reach their targets. Early long-range bombing exercises were farcical.

The man responsible for redeeming this situation was Curtis LeMay, a bomber veteran of the Second World War and ruthlessly capable in management, tactics and strategy.

Promotions weren't given to individuals, but to an entire crew, sometimes on the spot. And when one person screwed up, the rest of the crew also paid the price. Officers lost their jobs because of accidents and honest mistakes. “I can't afford to differentiate between the incompetent and the unfortunate,” LeMay explained. “Standardization” became the watchword at SAC, repeated like a mantra and ruthlessly pursued, with manuals and checklists and numeric measures of success created for every job. Team players were rewarded, iconoclasts and prima donnas encouraged to go elsewhere. LeMay wanted SAC to function as smoothly as the intricate machinery of a modern bomber. “Every man a coupling or a tube; every organization a rampart of transistors, battery of condensers,” he wrote in his memoir. “All rubbed up, no corrosion. Alert.”

This remains perhaps the image of the nuclear branches of the military in the United States and elsewhere: professional to a total degree unprecedented in history, befitting their mastery of the highest available technology and their role in enacting decisions critical to national survival – decisions, indeed, of species-level importance. Affectless but effective – a rarefied level of professionalism at the limits of what humans were capable of.

Very quickly, ‘too few bombs’ wasn't the problem any more. The ghastly Alice in Wonderland logic of the arms race set in, governed by the baroque, non-Euclidean moral geometry nukes enfold about themselves. On practical and humanitarian grounds, the USA abandoned the ‘atomic blitz’ for a ‘no cities’ policy of strictly military targets, intended to knock out the means of retaliation in kind and force the Soviets to concede peace. But ‘no cities’ just led to endless escalation: more bombers and missiles on one side meaning more targets for the other, meaning more missiles and bombers, and so on forever. And the military hated ‘no cities’ – if there must be nuclear war, LeMay argued, why not fight it to win, using all means at your disposal? Why spare your enemy at all – why not let him be certain that your first move will be his obliteration? So in 1960 ‘no cities’ was abandoned and the atomic blitz was back on, under the sedative name Single Integrated Operational Plan (SIOP). The RAND Corporation built computer programmes that calculated the maximum devastation and casualties that could be inflicted on the Soviet Union and its people. And the SIOP grew and grew. By the 1970s it called for the USSR and its allies to be carpeted with 10,000 nuclear weapons – even Nixon and Kissinger were horrified at its extent.

Thousands of nukes – nukes in every corner, a nuke for every job, every problem solved by the application of more nukes. The Navy had its Polaris submarines; the Army experimented with nuclear mines and artillery, and developed the Davy Crockett, a nuclear recoilless rifle. On top of its missiles and bombers, the Air Force developed the Genie, a nuclear air-defence weapon intended to fry incoming enemy planes. Nukes piled up in, and circled above, America's allies, often without their people's knowledge or approval. Schlosser is a gifted researcher with a covetable knack for making a sprawling and complicated subject clear and urgent. At the heart of his history of this arsenal is a gripping paradox. America's nuclear stockpile had to be superlatively safe. But it could not be so bound up in safety measures that it could not be used when needed, very possibly at minutes' notice and in chaotic circumstances. This paradox was called ‘Always/Never’: ‘Ideally, a nuclear weapon would always detonate when it was supposed to – and never detonate when it wasn't supposed to.’

Simple – but horribly complicated. And as the nukes multiplied, so did the accidents; dozens are described in Command and Control, from the blood-curdling to the comic. Safety measures proliferated – new switches, toggles, detectors, circuits, keycodes and locks – just as the hair trigger was made more and more sensitive. Bomber squadrons were kept on ‘airborne alert’, permanently armed and circling so they couldn't be destroyed on the ground in the event of a surprise attack – but this hugely increased the risk of plane crashes, which account for the greatest share of the foul-ups Schlosser describes. Among the most serious of these was in 1961, when a B-52 out of Seymour Johnson Air Force Base in Goldsboro, North Carolina, went into an uncontrolled spin and broke apart. Schlosser relates a litany of safety mechanisms that failed; only one stopped a thermonuclear device from detonating near Faro, North Carolina.

As systems became more elaborate, they interacted in surreal and unexpected ways. The Thule monitoring station in northern Greenland was built to give the earliest possible warning of a Soviet attack, but the war planners believed it might be destroyed before it had a chance to alert anyone. So a bomber was kept circling above it to make sure it was still there – the Thule monitor. On 21 January 1968 a B-52 serving as Thule monitor caught fire and crashed. It was carrying four Mark 28 nuclear weapons. Three square miles of ice, the territory of a US ally, were contaminated with radiation, necessitating a nightmarish clean-up in one of the most remote parts of the world. None of the bombs exploded, but if they had, Thule might have been destroyed. Schlosser notes:

[T]he partial detonation of a nuclear weapon or two, or three – without any warning, at the air base considered essential for the defense of the United States – could have been misinterpreted at SAC headquarters. Nobody expected the Thule monitor to destroy Thule.

The tone of dry understatement is typical – Schlosser remains pleasantly matter-of-fact even when the material he handles is sensational enough to make the reader's head spin. The slapstick qualities on display here and elsewhere – po-faced, belt-and-braces military thinking leading it to tie its own shoelaces together – are enough to make Stanley Kubrick's Dr Strangelove look like a documentary. Indeed Strangelove was relatively plausible: deliberate launch or misuse of weapons by an aggrieved or deranged serviceman (or rogue elements within a NATO ally, or Spetsnaz commandos) was one of the scenarios that most vexed America's nuclear custodians. Airborne alerts, the hairiest of all hair triggers, ended the day after the Thule crash.

Whatever the multifarious lethality of the loosely chained beast next door, it was safer in the silos than endlessly circling in the sky in ageing B-52s. Nevertheless there were plenty of missile-related accidents – notably a nightmarish fire at launch complex 373-4, outside Searcy, Arkansas, in August 1965, which killed 53 workers. The missile in 373-4 that day was, however, undamaged – and by a hardly believable quirk of fate, it was the missile sitting on the launch mount at Damascus when the socket from Powell's wrench struck its side.

The story of what happened at Damascus is spread quite thinly through Command and Control, the meat of which is Schlosser's riveting history of the arsenal as a whole. Airmen Plumb and Powell left the silo in a hurry; in the launch control centre, behind a series of blast doors, alarms sounded and warning lights multiplied. The Damascus Incident sits in a curious subgenre of nerve-wracking history and fiction that evolved as mechanisation took command during the 20th century: a more modern Prometheus we might call the breakdown tragedy. This kind of story has emerged from increasingly complex systems and the ideologies that govern them. Its cast is a new kind of man and woman created by the architects of these systems, the kind of technocratic super-professional Curtis LeMay compared to a precision component, ‘a coupling or a tube … All rubbed up, no corrosion. Alert.’

These precision beings, the sysman and the syswoman, find their natural home in the nuclear industries and the space programme, but also to a lesser extent in modern utilities, industry, and communications – wherever technological complexity and system organisation have crystallised to a particularly ornate degree and the (human, financial, environmental) stakes are high. A problem arises and escalates, turning into a cascade of problems. This could be related to an external factor, such as an earthquake or an attack, but in its purest form the problem must stem from the inner workings and weaknesses of the system itself. This crisis will, of course, be unforeseen – if it had been foreseen, these consummate professionals would have no trouble handling it with checklists, first-rate training and keen instincts. As the instrument panels blaze with blinking alerts and the sirens wail and the bad news comes in from all angles, they are in the situation no one expected, the situation everything around them and within them has been structured to prevent: they don't know what to do. The situation, if it is remediable at all, will call on intellectual brilliance and pragmatic thinking under immense pressure, basic human bravery and an almost visceral understanding of the systems involved. In short, a new strain of modernist heroism.

JG Ballard, always prescient, was perhaps the first to notice the unique poignancy of this kind of story. In his annotations to The Atrocity Exhibition, he mentions the Soviet cosmonaut Colonel Komarov, the first man to die in space. Komarov was rumoured to have panicked at the controls, but Ballard disbelieves this, calling it ‘NASA black propaganda’. It would be out of keeping with the sysman mentality:

The courage of professional flight-crews under extreme pressure is clearly shown in The Black Box, edited by Malcolm MacPherson, which contains cockpit voice-recorder transcripts in the last moments before airline crashes. The supreme courage and stoicism shown by these men and women in the final seconds running up to their deaths, as they wrestle with the collapsing systems of their stricken aircraft, is a fine memorial to them, and a powerful argument for equal frankness in other areas.

In nonfiction (and dramatised nonfiction), other high-quality examples of this kind of ‘breakdown tragedy’ include the 1995 film Apollo 13 and Ablaze, Piers Paul Read's 1993 account of the 1986 explosion and fire at the Chernobyl nuclear power plant. In fiction, the late Michael Crichton made a career from assembling this kind of situation, with varying degrees of success.

There must be some relation between the breakdown tragedy and what the writer John Rogers has called ‘competence porn’ in film and television – the pleasure that comes, in shows such as House, CSI: Crime Scene Investigation and The West Wing, from watching brilliant and highly trained individuals solve devilish problems. But the breakdown tragedy is not always about the solving – it is about the moment a total system is revealed to be fatally flawed. In this respect it resembles classic human tragedy, in which a hero is brought down by his or her inward human flaws – except that here the flaws come from the machine the heroes serve, which makes the situation no less tragic, and no less human.

Every day we are compelled to place our faith in complex systems, systems that keep the lights on and keep planes in the air and radionuclides out of it. Generally we prefer not to think about these systems too much, but to merely assume they are sophisticated – heroic, even – and staffed by able sysmen and women. This is modern mythmaking, as is the figure of the sysman himself. We want to believe that preternaturally calm, uber-competent people have their white-gloved fingers at the control panels we assume to be behind the infrastructure that surrounds us, and this belief is abetted by the bodies that govern those systems, who want our confidence rather than our scrutiny. (As is only natural.) The great power of Eric Schlosser's excellent book is in its exploration of the mysterious space between the ideological myths around nuclear weapons and the technological and human reality. It is a remarkable story of human endeavour, and a luminous picture of modernity.

Will Wiles is a freelance writer and the author of a novel, Care of Wooden Floors. His second novel, The Way Inn, will be published by Fourth Estate in June 2014.