This is the shorter of two versions of this post; the longer version is here

I updated on some points between completing the two posts, and the longer version reflects those updates. If I have time, I'll try to update this shorter version to match—but the differences are relatively minor (I think).

Summary

This post outlines how a small group of focused, committed individuals successfully reduced catastrophic risk from inside the US federal bureaucracy, and considers potential lessons from their experiences.

Before the mid-1970s, US nuclear weapon design didn't fully account for the effects of “abnormal environments” like fires or plane crashes. Meanwhile, there were multiple “near-miss” accidents in exactly those types of environments, some of which came uncomfortably close to catastrophic detonations. Those, in turn, could plausibly have triggered all-out war if misinterpreted as intentional deployments.

Engineers at Sandia National Laboratories did intensive design and development work to address the problem between 1968 and 1972. But it took until 1990 before older weapons that didn’t meet the new safety standards were removed from US Quick Reaction Alert.

Why the delay? Stonewalling, evasion, and vested interests, both bureaucratic and military. Several of the Sandia engineers, led by Bob Peurifoy, spent two decades relentlessly pressing decision makers to see reason.

(Acronym sidebar: at this point, I’m introducing an acronym to refer to Bob Peurifoy and his allies: the SEAs (Sandia Engineer-Advocates). I’m not wedded to it, but I’ve found it a useful shorthand when writing and I suspect it might make for a smoother reading experience compared to the alternatives).

Key takeaways

With low-ish confidence in this topic (see Framing, just below), here are my main takeaways in terms of potential lessons and points of inspiration:

  • Peurifoy showed considerable personal agency in assuming the mantle of safety advocate as an engineer and manager at Sandia.
  • Relatedly, he and the SEAs persisted over a long period despite slow progress and significant opposition.
  • The interventions of key individuals, including Senator John Glenn, were important catalysts for the SEAs’ eventual breakthrough. These individuals had credibility and influence, while also being less vulnerable to professional consequences than the SEAs.
  • When the SEAs eventually got the opportunity to advocate directly to those with the power to effect change, their many years of continued effort meant that they had all the arguments and evidence ready to make their case quickly and effectively.
  • Peurifoy was an expert weapon designer himself, combining strong technical and social skills.[1] His excellence enabled him to grow in seniority over time, increasing his influence.
  • The SEAs were a great team whose members used their collective strengths to push for change together.
  • The SEAs worked within the system and pushed at its boundaries—e.g. by using leverage to insist on safety features in new weapons, or by briefing the press.
  • It’s plausible that the SEAs could have achieved earlier or greater success with different approaches to strategy or communications. But they were working in a pretty intractable problem space—it’s likewise plausible that they just had to wait for the Overton window to shift in order to make their breakthrough (and that their advocacy may itself have contributed to that shift).

Framing and methodology

This is a low-ish confidence post. It’s the result of a few weeks’ research into most of the readily available sources; I’m not a trained journalist or historian, I’ve never worked for a government or a large bureaucracy, and I’m not an expert in nuclear weapons or any other content area covered here. Most of the main players are now dead, and many primary sources are still classified or otherwise hard to access. So it seems pretty likely that I’ve missed or misrepresented some key information.

Eric Schlosser’s Command and Control was my main entry point to this topic; I’ve referenced it a good amount and followed as many trails from its citations as I had time for. I also interviewed two individuals who were directly involved in the events: Gordon Moe and Stan Spray. There’s a collection of Bob Peurifoy’s personal papers at the Hoover Institution; I wasn’t able to visit this in person, but I did get copies of a few of the documents, referenced in the relevant footnotes.

Timeline of key events

(Not exhaustive—just for quick reference/visualization. Details and citations in main body of post.)[2] 

1961: Goldsboro accident
1966: Palomares accident
1968: Thule accident; Military Liaison Committee issues new weapon safety criteria (aka Walske criteria); Sandia creates Nuclear Safety Department led by Bill Stevens and Independent Safety Assessment Group led by Stan Spray
1968-71: Sandia engineers investigate abnormal environments, define ENDS principles
1972: Sandia engineers investigate and recommend IHE (collaboration with Los Alamos Labs)
1973: Bob Peurifoy promoted to Director of Weapon Development at Sandia, conducts stockpile safety review
1974: Peurifoy & Glenn Fowler brief Sandia management and AEC, write Fowler letter
1975-90: Burned Board Briefings: over 5,000 briefed
1976: First bombs with ENDS enter stockpile
1979: First bombs with ENDS and IHE enter stockpile
1980: Grand Forks and Damascus accidents
1983: Peurifoy promoted to VP of Technical Support at Sandia
1988: Moe panel and report
1989: Peurifoy briefs John Glenn, who escalates to Secretary of Energy
1990: Washington Post articles exposing safety issues; Cheney removes W69/SRAM-A from Quick Reaction Alert; Drell Panel report
1991: Peurifoy retires

Problem space: US weapons safety and accidents up to 1968

Pre-1970s, US nuclear weapon designs didn’t consistently account for handling errors or accidents,[3] and their electrical components could behave unpredictably in the extreme heat of a fire or lightning strike.[4]

The US military was typically resistant to safety requirements that they perceived as obstacles to quick and effective deployment,[5] resented civilian control of nuclear weapons, and resisted attempts to update deployed weapons that were already in their hands.[6]

But over a thousand weapons were involved in incidents or accidents between 1950 and 1968,[7] including at least 31 Broken Arrows (major incidents).[8] Three notable Broken Arrows were US Air Force crashes involving hydrogen bombs:

Goldsboro, North Carolina (1961)

The plane broke up during descent, three crewmen died, and its two bombs fell to the ground.[9] According to Sidney Drell, “just one switch in the arming sequence of one of the bombs, by remaining in its ‘off position’ while the aircraft was disintegrating, was all that prevented a full-yield nuclear explosion.”[10]

Palomares, southern Spain (1966)

Seven crew members died in a mid-air collision by the coast; three of four bombs fell on land, the fourth in the ocean.[11] Peurifoy wrote that “the danger of a nuclear detonation was similar to that in the Goldsboro accident.”[12] While there was no nuclear yield,[13] the high explosives in two of the land-fallen bombs partially detonated, contaminating a village with plutonium.

Thule, Greenland (1968)

A plane crashed into the ice near a strategically-critical airbase, killing one crew member.[14] Again, there was no nuclear event, but the explosives in all four bombs detonated, dispersing plutonium for miles. Fred Iklé, former Under Secretary of Defense for Policy, believed that given Thule’s location and strategic importance, “if [there had been] a nuclear explosion beyond just a scattering of nuclear materials, we would have been very close to the edge of nuclear war by accident.”[15]

Design solutions, 1968-1972

In the aftermath of Thule, the Military Liaison Committee to the AEC[16] defined more stringent probability limits for premature detonations.[17] Sandia President Jack Howard appointed Bill Stevens as head of the new Nuclear Weapon Safety Department to meet those requirements,[18] while Stan Spray led Sandia’s Independent Safety Assessment Group, which “ruthlessly burned, scorched, baked, crushed, and tortured weapon components to find their potential flaws.”[19]
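For reference, the Walske criteria are commonly summarized (my paraphrase from public secondary sources, not from the documents cited in this post) as bounds on the probability of a premature nuclear detonation:

```latex
% Commonly cited figures for the 1968 Walske criteria; paraphrased from
% public secondary sources, not from this post's citations.
P(\text{premature detonation} \mid \text{normal environments, weapon lifetime}) \le 10^{-9}
P(\text{premature detonation} \mid \text{exposure to an abnormal environment}) \le 10^{-6}
```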

The urgency around safety wasn’t a given for everyone at Sandia yet. On first becoming department head, Stevens “wasn’t convinced that nuclear weapon accidents posed a grave threat to the United States,”[20] but reading the available accident reports “persuaded [him] that the safety of America’s nuclear weapons couldn’t be assumed.”[21]

The work of Stevens’s and Spray’s teams led to the definition of the three principles of ENDS (Enhanced Nuclear Detonation Safety):

  • Incompatibility—the signal used to arm the weapon before firing must be “unique relative to signals found in nature.”[22] Achieved via use of e.g. a unique signal generator.
  • Isolation—a weapon’s firing set and detonators must be isolated from unintended energy sources by physical barriers within the firing circuit, aka “strong links.”
  • Inoperability—“essential elements for detonating the warhead are designed to become inoperable…before the isolation features fail,”[23] aka “weak links.”
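The interplay of the three principles can be sketched as a toy Boolean model. This is purely illustrative—all names are hypothetical and it bears no relation to real firing-set logic—but it shows why an accident has to defeat all three principles at once:

```python
# Toy illustration of the three ENDS principles as Boolean conditions
# that must all hold for the firing set to fire. Hypothetical names;
# not a model of any real weapon logic.

UNIQUE_ARMING_SIGNAL = "unique-pattern"  # stand-in for the unique signal generator's output

def fires(signal, strong_links_closed, weak_links_operable):
    """Detonation requires the unique arming signal (incompatibility),
    isolation barriers deliberately closed (isolation), and essential
    elements still operable (inoperability)."""
    return (signal == UNIQUE_ARMING_SIGNAL  # incompatibility: only the unique signal arms
            and strong_links_closed         # isolation: strong links intentionally closed
            and weak_links_operable)        # inoperability: weak links fail first in accidents

# In an accident, stray currents produce arbitrary signals, strong links
# stay open, and heat destroys weak links before the barriers fail:
assert fires("unique-pattern", True, True) is True    # intended firing sequence
assert fires("random-noise", True, True) is False     # stray signal rejected
assert fires("unique-pattern", False, True) is False  # barriers still open
assert fires("unique-pattern", True, False) is False  # weak links already failed
```

The point of the design is that the failure modes reinforce each other: a fire that might close a strong link should already have destroyed the weak links.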

Sandia also collaborated with Los Alamos Labs to research and recommend the use of insensitive high explosives (IHE) in 1972;[24] IHE is much harder to detonate than the conventional explosives used up to that point.[25]

The Fowler letter: 1973-1974

Peurifoy became director of weapon development at Sandia in September 1973 and, after reading through the Broken Arrow reports, concluded that the US was “living on borrowed time.”[26] Here’s what he did next, in his own words:

  • I required that all new weapon designs use ENDS technology.
  • I recommended to my vice president, Glenn Fowler, that we push for a safety upgrade of weapons then in the stockpile.[27]

Peurifoy and Fowler briefed the unreceptive Sandia management team, whose anti-retrofit contingent argued that “recommending a retrofit would be a suggestion that Sandia had been imperfect, that new weapons…[would] eventually replace the older ones, and that a retrofit program would waste resources on the stockpile instead of…R&D.”[28]

Fowler and Peurifoy escalated to a senior AEC official in April 1974, who was also unmoved. So they put their safety concerns on the record in what became known as the “Fowler letter,” recommending the retrofit or retirement of many weapons in the current stockpile. “Fireworks erupted in Washington,” Peurifoy recounted, “because plausible deniability had been destroyed.”[29] Don Cotter, an AEC official and former Sandia engineer, was both unconvinced and offended. “It’s our stockpile. We think it’s safe. Who do you guys think you are?”[30]

Schlosser writes that “Fowler placed his career at risk” by signing the Fowler letter.[31] While he doesn’t cite or expand on this, Peurifoy wrote elsewhere that Fowler was “abused by Sandia, the AEC, and the DoD” in response to the letter.[32]

Burned boards and slow progress: 1975-1988

In 1975, the SEAs launched the “Burned Board briefings,” in which Stan Spray presented hardware damaged in his abnormal-environment tests and showed the resulting impacts on weapon circuit behavior. They briefed thousands, both internally at Sandia and externally to senior military and government officials; Bill Stevens estimated an aggregate external audience of over 5,000 from 1975-1990.[33] Spray recalls that listeners were typically receptive, but bureaucratic and budgetary stalling continued.[34]

Stevens also noted that “time and time again, [Peurifoy] sought and obtained a position as technical adviser in the series of major high-level studies,”[35] using those platforms to push for greater safety prioritization and better coordination between the Departments of Defense and Energy.[36]

The SEAs did make some progress in this period, seeing their safer designs rolled out in new weapons. The first bombs with ENDS entered the stockpile in 1976, and bombs with both ENDS and IHE followed in 1979.[37] Peurifoy and Stevens also used leverage when they could, refusing to sign new weapon releases until they had ENDS devices.[38]

The SEAs’ case was further strengthened by two significant accidents involving older, deployed weapons in the same week of September 1980:

September 15, Grand Forks, North Dakota

A B-52 bomber caught fire, but the wind blew the fire away from where its 12 nuclear missiles were loaded. An expert witness testified to a closed 1988 Senate hearing that if the wind had blown in the other direction, “[it] could probably have been worse than Chernobyl.”[39]

September 18-19, Damascus, Arkansas

An ICBM exploded in its silo following a fuel leak, propelling its nine-megaton warhead into the air, and destroying its launch complex. One Air Force serviceman died, and 21 people were injured.[40]

In 1983, Peurifoy became a Sandia VP, increasing his influence and reach. But there was still very little progress in phasing out older weapons of concern throughout the 80s.[41]

Breakthrough period: 1988-1991

In 1988, Peurifoy took part in a federal safety management review and helped to recruit Gordon Moe to serve as its chair. Moe had started his career as a Sandia engineer before becoming a White House policy expert; by 1988 he was a private sector security consultant. His panel’s report used the W69/SRAM-A weapon system as a damning case study in program management failures.[42]

Moe briefed his report to military and DOE officials, but was met with the same response the SEAs were used to getting.[43] Then a stroke of luck got things moving in spring 1989—Ohio Senator John Glenn happened to be visiting Sandia,[44] and Peurifoy got the opportunity to give him a thirty-minute briefing. Glenn was immediately won over and escalated the issue to the Secretary of Energy, James D. Watkins.[45] In turn, Moe was summoned to brief Watkins and recalls his response:

“Oh God, that’s terrible. That would be really embarrassing to the president” … He said he’d be seeing the Defense Secretary [Dick Cheney] the next day and he would discuss the concern with him.[46]

In spring 1990, more pressure came in the form of a succession of Washington Post articles for which Peurifoy was a key source.[47] Then, in June, Cheney ordered the SRAM-As temporarily out of service.[48] Meanwhile, the House Armed Services Committee engaged three leading physicists (the Drell Panel) to further investigate Moe’s findings. They released their report in December 1990, recommending the adoption of ENDS and IHE throughout the stockpile and vindicating the SEAs.[49]

By the time Bob Peurifoy retired in the spring of 1991,

his goals had been achieved…The changes in the stockpile that Peurifoy had sought for decades, once dismissed as costly and unnecessary, were now considered essential. Building a nuclear weapon without these safety features had become inconceivable.[50]

Lessons

Bill Stevens attributed the successful campaign to retire the W69/SRAM-A to “the roles of deep personal commitment to a belief, perseverance, knowing how the ‘system’ really works, and the value of serendipity plus a leak to the media.”[51] Many of those factors also apply to the whole, decades-long advocacy effort that preceded it.

Bob Peurifoy’s personal qualities and efforts stand out; Stan Spray believes that “without Bob Peurifoy, nothing would have happened,” and Gordon Moe agrees: “If not for Bob, nothing would have changed. When he got hold of something, he would not let go.”[52]

Peurifoy’s agency and persistence are particularly noteworthy. He seems to have assumed safety as a personal responsibility, considerably beyond the scope of his various roles. He developed key relationships at Sandia and the Pentagon, and with reporters and activists, and advocated for more than 15 years without much apparent success and in the face of constant opposition.[53]

I haven’t come across any references to specific occasions when Peurifoy was at risk of losing his job, but it must have felt like a possibility at times (in his own mind, at least). He apparently received a negative performance rating in 1989—the year he briefed John Glenn[54]—and his references to “fireworks erupting” (above) and to Glenn Fowler being “abused” (ditto) suggest the type of responses he sometimes got for his advocacy efforts.

Stan Spray’s work is significant, too, in using the physical results of his research to bring the safety problems to life for the Burned Board audiences:

I had no authority—I had to convince people that we needed to change what we were doing. The way I could do that is through technical persuasion.[55]

And the interventions of Gordon Moe and John Glenn are instructive. Both had credibility and connections in government and science: Moe as an engineer and policy adviser in the Nixon and Ford administrations; Glenn as a sitting senator and former astronaut. Plus they were less vulnerable to repercussions for speaking their mind than the SEAs.[56]

The SEAs’ combination of slow-burn persistence and long-term readiness also seems important. Despite making limited progress between 1974 and 1988, it’s plausible that their continued advocacy played a small part in shifting the Overton window in their favor (I assume other factors were more significant, like the changing geopolitical situation and the aging out of older weapons). And their readiness meant that when the opportunity came to get in front of the right people, they had all the arguments and evidence ready to make their case effectively.

What mistakes might the SEAs have made? Schlosser writes:

Peurifoy told me, on many occasions, that he regrets not having been braver…He’d chosen to work within the system, despite his strong opposition to many of its practices. Although he was critical of the way in which official secrecy has been used to cover up mistakes, he’d honorably obeyed its code.[57]

It seems reasonable to speculate that the SEAs might have counterfactually achieved greater or quicker success with different approaches, but I’m pretty uncertain about this. Making a brief comparison to a somewhat analogous case: Matthew Meselson apparently began advocating against biological weapons around 1963;[58] by 1969, Nixon had renounced them, with Meselson’s advocacy a significant factor.[59] And, by his own account, Meselson was meticulous in his strategy and thought carefully about the best messaging for his target audience (the President and his advisers).[60]

I’ve only spent a little time looking into Meselson’s case, but some notable differences jump out between his and the SEAs’ advocacy situations. First, biological weapons were likely the more tractable cause area, perhaps significantly so; in Meselson’s words, “we didn’t need them…it was never something that the military liked.” Second, Meselson was largely independent, a celebrated academic working outside the system (save for some occasional government consulting gigs). Third, he made good use of his connections—he got his decisive access to Nixon via Kissinger, his former Harvard neighbor.[61]

The comparison with Meselson highlights some important aspects of the SEAs’ undertaking: they were working inside the system, on a pretty intractable national security problem, with no obvious social connections to the White House or Cabinet.[62] Their success was a long time coming and perhaps might have been quicker or more comprehensive; equally, it might also have been slower or lesser, given the challenges they faced.

Acknowledgments

My work on this post was funded via the Future Fund Regranting Program.

Many thanks to Gordon Moe and Stan Spray for agreeing to be interviewed and being so generous with their time, and to Ben West, Darius Meißner, Gordon Moe, Stan Spray, and Toby Jolly for reviewing drafts and giving feedback. All mistakes mine!


Notes

  1. ^

    A combination that Stefan Torges notes in transformative leaders like Oppenheimer here.

  2. ^

    Peurifoy has a similar timeline in Shultz & Drell (2012), p. 74. Google Books link

  3. ^

    According to Bob Peurifoy, “The risks associated with abnormal (accident) environments were not well recognized by Sandia senior management until 1968.” (Peurifoy in Shultz & Drell (2012), p. 70).

  4. ^

    “The charring of a circuit board could transform its fiberglass from an insulator into a conductor of electricity. The solder of a heat-sensitive fuse was supposed to melt when it reached a certain temperature, blocking the passage of current during a fire…[but it could] flow back into its original place, reconnect wires and allow current to travel between them.” (Schlosser (2013), p. 330)

  5. ^

    E.g. see Schlosser (2013), pp. 264-265, or Peurifoy in Always/Never Part 1, 22:27, “their focus was reliability, readiness—not accident safety.” 

  6. ^

    Former Sandia president Jack Howard recalled a typical military response after his demonstration of a critical new safety feature: “that’s an interesting solution, but we don’t have a problem that goes with it.” (Always/Never Part 1, 20:30.) Howard was referring to permissive action links, whose installation the military opposed over a long period—see Schlosser (2013), pp. 265, 298, 371, 440.

  7. ^
  8. ^

     A Broken Arrow is “any unplanned occurrence involving loss or destruction of, or serious damage to, nuclear weapons or their components which results in an actual or potential hazard to life or property.” (Kidder (1991), p. E1-2).

  9. ^
  10. ^

    Shultz & Drell (2012), p. 3. If that switch had also failed, lethal fallout could have spread as far north as New York City (Schlosser (2013), p. 247).

  11. ^
  12. ^

    Peurifoy in Shultz & Drell (2012), p. 68

  13. ^

    The weapons were designed to be “one-point safe” to prevent a full nuclear detonation in abnormal or unintended situations. The bomb’s nuclear core was encased in high explosives which had to be detonated at multiple points simultaneously for the bomb to work as designed. If the explosives were only detonated at one point, they’d consume themselves without transferring enough energy to cause a nuclear explosion. See also Always/Never Part 2, 15:27

  14. ^
  15. ^
  16. ^
  17. ^

    See Wolfgang (2012), pp. 6-7 

  18. ^

    Stevens (2001), pp. 90-91

  19. ^
  20. ^
  21. ^
  22. ^
  23. ^

    Drell in Shultz & Drell (2012), p. 44

  24. ^

    Stevens, The Origins and Evolution of S²C at Sandia National Laboratories, 1949 to 1996, SAND99-1308, Official Use Only (2001), p. 105

  25. ^

    They would have prevented the detonation and plutonium dispersal at Palomares, for example, per Drell in Shultz & Drell (2012), p. 50.

  26. ^
  27. ^

    Peurifoy in Shultz & Drell (2012), p. 70

  28. ^

    Stevens (2001), p. 115 

  29. ^

    Peurifoy in Shultz & Drell (2012), p. 71.

  30. ^
  31. ^
  32. ^

    Peurifoy’s draft of the article, ‘Nuclear Weapon Safety Issues,’ Oct 3-4, 2011, in Peurifoy papers, Hoover Institution

  33. ^

    Stevens (2001), p. 116 

  34. ^

    Interview with Stan Spray.

  35. ^

    Stevens (2001), p. 233 

  36. ^
  37. ^

    Peurifoy in Shultz & Drell (2012), p. 71

  38. ^

    Schlosser (2013), p. 372, and Stevens (2001), p. 148. Peurifoy even snuck additional features into an Air Force order for new control boxes (Schlosser (2013), p. 441).

  39. ^

    Quoted by Senator William Cohen in a 1992 Senate debate, p. S11186

  40. ^
  41. ^

    In 1990, Sidney Drell testified to Congress that “the stockpile improvement program has been proceeding slowly with priority given to new weapons rather than [those] remaining in the stockpile.” (Drell testimony to Congress, December 18, 1990, p. 7) 

  42. ^

    "Attention to safety has waned, and we still have risks from weapons that will remain in the stockpile for years…It would be hard to overstate the consequences that a serious accident could have for national security." Shulz & Drell (2012), p. 72 

  43. ^

    “the military were not very pleasant when I briefed them, and civilians in the Pentagon were not much happier…I briefed it all around and nothing much happened.” (Interview with Gordon Moe)

  44. ^

    Bill Stevens referred to Glenn’s visit as “serendipity” (Stevens (2001), pp. 162, 164) 

  45. ^

    See Peurifoy in Shultz & Drell (2012), p. 72: “I gave him a thirty-minute safety briefing, using a picture of the Grand Forks fire, a display of the many safety briefings given to government officials (about 800), and the Moe study. The senator asked me what Admiral Watkins (then Secretary of Energy James D. Watkins) thought of my safety concerns. I said that Admiral Watkins did not know of my concerns. Senator Glenn then said, ‘I’ll be traveling with Admiral Watkins...next week, and I’ll discuss this with him.’”

  46. ^

    Interview with Gordon Moe 

  47. ^
  48. ^

    The removal was made permanent in December 1990 (Stevens (2001), p. 165). 

  49. ^

    See Peurifoy in Shultz & Drell (2012), pp. 72-73; Schlosser (2013), pp. 454-456; Stevens (2001), pp. 165-167

  50. ^
  51. ^

    Stevens (2001), p. 162

  52. ^

    Interviews with Stan Spray and Gordon Moe, respectively. 

  53. ^

    In a letter sent right after Peurifoy’s retirement, his former manager Glenn Fowler summed up his contribution: “Your persistent attention to safety and reliability, despite somewhat uncomfortable circumstances at times, has been the major factor in the avoidance of a disastrous accident which would have changed the military and political posture of America.” Correspondence from Glenn Fowler to Bob Peurifoy, March 24, 1991, in Peurifoy papers, Hoover Institution

  54. ^

    Undated footnote by Bob Peurifoy in Peurifoy papers, Hoover Institution

  55. ^

    Interview with Stan Spray. 

  56. ^

    As Moe recalls, “there was really no way they could affect me…I was pretty much a free agent, and I think that’s why Peurifoy decided to give me that job. [They thought] ‘he can maneuver around all that space, but he’s not going to get sidelined by any particular agency or general.’” (Interview with Gordon Moe).

  57. ^
  58. ^
  59. ^
  60. ^
  61. ^
  62. ^

    Bob Peurifoy attended Texas A&M University, whose first alumnus Cabinet Secretary was appointed in 1992.

Comments

Thanks for writing this up, both as a lesson-learning exercise and just as an inspiring example.

It was cited a bunch of times in this post, but for anyone who missed it, I think Schlosser's book "Command and Control" (https://www.penguinrandomhouse.com/books/303337/command-and-control-by-eric-schlosser/) is a fascinating read for anyone concerned with the bureaucratic management of safety technologies and global catastrophic risk. Not always encouraging — though examples like Peurifoy are — but definitely educational.

Thanks! And +1 on Command and Control—there's also a documentary version, which focuses mostly on the events of the 1980 Damascus explosion. I recommend the documentary Always/Never—it was made by Sandia, so they gloss over/downplay some aspects relative to Schlosser (e.g. disagreements between the engineers and the military). But it's an accessible and fascinating (IMO) overview with a ton of first-hand Sandia accounts from e.g. Peurifoy, Spray, Stevens, Howard, as well as senior government and policy folks like Robert McNamara, James Schlesinger, Fred Iklé.