Key points from The Dead Hand, David E. Hoffman

by Kit · 7 min read · 9th Aug 2019 · 8 comments


Nuclear Weapons · EA Books · Existential risk

The Dead Hand recounts high-level decision-making around nuclear and biological weapons, and how nations engaged in arms races and arms control, particularly the USA and USSR during the eras of Reagan and Gorbachev.

I summarised the parts I found most insightful in order to clarify my own understanding. I post these extremely compressed notes here in the hope that they will introduce you to some new concepts and/or help you decide whether to read the book. I usually do not attempt to verify facts. (Interpretations I make are in brackets.)

Strategic misunderstandings

Through much of the Cold War, USSR leadership believed that the USA might launch a surprise nuclear attack on the USSR. US leadership considered this out of the question, in large part because it seemed implausible that an aggressor could ‘win’ a nuclear war, though some USSR and US generals did believe in the idea of ‘winning’ a nuclear war. Further, when NATO spies obtained documents written by USSR leadership detailing a project to detect signs of the USA preparing a first strike, key US figures thought it more likely that the documents were part of a propaganda campaign (against the USA, or against the intermediate-range Pershing missiles being stationed in Europe) than that USSR leadership really believed the USA might launch a first strike.

Poor understanding was sometimes mistaken for bad faith. In a proposal to reduce the number of long-range nuclear weapons, Reagan included the ask that, for both the USA and the USSR, ‘no more than half of those warheads be land-based’. This seemed to him like a great idea, reducing danger to both countries. Brezhnev saw this as hard to take in good faith because the USSR was much more dependent on land-based missiles than the USA. (This was a very large detail to be ignorant of, and I think it would have seemed unlikely to me that this proposal was in good faith even with my basic knowledge of the situation.)

Some US political leaders hypothetically wanted to crush the USSR, or communism, but wanted peace much more. (Reconciling various statements, e.g. reconciling fiery speeches about the badness of communism with letters declaring a full commitment to peace, seems like it would have been hard for USSR leadership.)

The accuracy of CIA information was often poor, regarding both weapons activity and the attitudes of USSR leaders. USSR intelligence about the US economy was particularly awful, with wild overestimates of how much of the economy was military.

The book says ‘X was a missed opportunity’ about a number of moments when agreements on arms control could potentially have been reached. (It is hard to tell what the counterfactual really was.)

Nuclear escalation

The USSR's attack-detection systems were not robust; the false warning of an incoming US missile attack on the watch of Stanislav Petrov is a key example. The book suggests that military leaders rushed the systems into deployment (though it's not obvious what the alternative was). The operators on the ground handled a wide range of issues and were well aware of the high false-positive rate: every day, the computer systems flagged many apparent signs of missile launches for inspection by the operators. (The Petrov situation was more extreme, with the automated systems stating high credence in a missile attack. However, the full description of the situation leaves me thinking that if someone other than Petrov had been on duty, the operations team would probably still have treated the alarm as probably false, leading either to roughly the same course of action, or to the warning being passed up the chain of command clearly flagged as a probable false alarm, with no extreme actions resulting.)

Systems failed in ways that both increased and decreased the likelihood of escalation. At one point, an erroneous message clearly instructed the entire USSR nuclear missile forces to move to higher alert, and only one team did so, with many of the others calling their superiors to question whether the message was correct.

Tangentially, the Chernobyl accident took days to be taken seriously by USSR leaders, possibly in part due to people not wanting to pass bad news up the command chain and a general lack of ownership/responsibility.

Most political leaders pretty strongly wanted to avoid dying. Also, the USSR was very concerned specifically about a 'decapitation' strike which would hit Moscow and prevent retaliation. (These concerns may or may not have been significantly linked.) Perimeter, a USSR project which automated substantial parts of the command chain and allowed for launch instructions to be relayed even if communications infrastructure was destroyed, would have enabled a retaliatory strike even in the case of a decapitation strike. (My impression is that Perimeter was put in place largely to mitigate the decapitation concern, though I'm not clear on that.) Possibly in contradiction to claims by Daniel Ellsberg that the Dead Hand – automatic launch of nuclear weapons in certain circumstances – was and remains deployed, the Dead Hand proper was rejected by the USSR military in favour of Perimeter, which kept some humans in the loop.

Missile defence

Key actors in the USA and USSR had wildly different perceptions of the implications of missile defence, the ability to shoot down intercontinental ballistic missiles with interceptor missiles. Reagan and some fraction of US leaders dreamed of a world where missile defence would render all nuclear weapons ineffective. Gorbachev and most USSR leaders saw missile defence as a means for the USA to obtain secure first-strike capabilities. The USA ceasing missile defence research was often a top-priority demand for the USSR during negotiations. (This research seems to have been compliant with the letter but perhaps not the spirit of the Anti-Ballistic Missile Treaty.)

Lots of people thought missile defence was impossible with technology at the time. (It remains largely ineffective today.)

Principal-agent problems

Some generals wanted to visibly flex military muscles, including by sending naval vessels or warplanes into USSR territory. Such exercises were not always known to political leaders. In at least one case, this lack of knowledge seems to have increased confusion about why the USSR felt threatened by the USA. Successful incursions by US aircraft put pressure on USSR forces to respond faster to potential threats. This may have contributed significantly to the downing of an off-course Korean Air Lines flight by a USSR military aircraft, causing an international incident.

A large USSR bioweapons research group worked on making a more virulent version of smallpox. (It is unclear to me how this came about. My vague guess is that high-level people asked particular scientists for bioweapons, and those scientists decided to try to enhance smallpox for reasons not particularly correlated with what makes the USSR safer.) A lead scientist at one point set the goal of being able to produce one new pathogen per month. (It's unclear whether this, too, was a useful goal for the USSR.)

Monitoring biological weapons activity

The US bioweapons programme was shut down before the Biological Weapons Convention was signed. The USSR had thousands of people working on bioweapons, in some cases with little consideration given to the treaty. As one example, Viktor Zhdanov, who championed the effort to eradicate smallpox, led a high-level council overseeing bioweapons work. Throughout the Cold War, USSR bioweapons scientists typically believed that the USA was also developing bioweapons illegally.

Anthrax killed around 100 people in Sverdlovsk in 1979. This drew international attention, and over the years NATO diplomats repeatedly questioned whether it was due to a leak of biological weapons. USSR authorities consistently blamed the incident on contaminated meat, and a single remark by Yeltsin is the only public, semi-official recognition that it was, in fact, a biological weapons leak. The cause was not confirmed by outsiders until location data on the victims became available. (The ability to cover up, or at least maintain plausible deniability about, the true cause was very surprising to me.) USSR experts met other experts and presented at conferences about the outbreak, lying, in many cases very convincingly, to more than 200 scientists and arms control experts.

Western experts thought bioweapons were not very useful to a state with nuclear weapons, so they assumed the USSR would not want them. This belief persisted even while USSR bioweapons programmes were running at full scale.

Generally, it is hard to figure out whether bioweapons activity is happening within a facility, though identifying buildings which might be being used for large-scale production was possible.

The defection of Pasechnik, the first person with a breadth of knowledge of USSR bioweapons programmes to defect, was a very big deal. The information was discussed in private, including with some USSR leaders. Going public with much more concrete information about large-scale biological weapons efforts would have crippled progress on nuclear issues, in particular by destroying support from Congress for cooperation.

Pasechnik defected upon increasingly seeing his own work as harmful, and was scared of what the British might do to him since he thought he could be seen as a war criminal. (He seems very brave.)

By the time Yeltsin gained power, NATO leaders believed that the USSR had a large biological weapons programme. Soon after gaining power, Yeltsin admitted the programme's existence and pledged to shut it down quickly. The generals managed to keep it alive; Yeltsin was not in control. Also, Gorbachev had hated the threat of nuclear weapons, so it seems likely that he, like Yeltsin after him, would have wished to shut down the biological weapons programme. Possibly this was too many battles to pick with the military-industrial complex at the same time.

(Also: ‘military-industrial complex’ is a useful, meaningful concept, at least when thinking about the USSR.)

It was sometimes useful for NATO diplomats to tell the USSR/Russia what they knew about secret programmes.

Visits to verify secret sites seemed very important. For example, visits by USSR scientists to US sites suspected by USSR intelligence of being biological weapons sites convinced Ken Alibek, who later defected, that the sites were not conducting weapons development.

Strategic/battlefield weapon distinction

This seems like a very useful distinction. Some biological weapons (e.g. anthrax) could be used against an army – used as battlefield weapons. Some are designed to kill indiscriminately – strategic weapons. This may make strategic biological weapons a ‘poor man's atomic bomb’.

‘His eyes were riveted on one word on the page, “plague.”’ This was the moment when the UK’s biological warfare specialist (reading information from Pasechnik) realised that the USSR was developing strategic biological weapons, apparently not a widely considered possibility beforehand. (It seems to me that a lot of what Pasechnik had to say would have sounded implausible at first hearing, even though it was highly accurate.)

Proliferation risks during the breakdown of the USSR

Soviet government departments tried to go into business, including by selling nuclear explosions (for civilian purposes such as digging canals). Chetek, a business, would handle the research and design of detonations, and the government would conduct them. This did not happen because of a continuing nuclear test ban.

Weapons scientists became very poor and desperate. The government dramatically cut wages for many people and was often not even paying these reduced wages reliably. A major intervention conducted by US diplomats and scientists was to set up institutes to help them move to civilian work. Getting a weapons scientist a grant to do civilian research or engineering could lift them out of poverty and prevent some of them from selling weapons technology. Scientists wanted to work on something meaningful, and recognising that and winning their trust was important to get them on board with these programmes.

People were keen, and not afraid, to discover and share the secrets of the arms race. Perhaps this apparent sudden shift was due to the breakdown of the police state / strict control on the actions of individuals, and the lack of other mechanisms for discouraging bad behaviour.

Individuals and small groups of scientists began to discuss selling enriched uranium to other states that wanted nuclear weapons. This greatly concerned NATO, since the main limit on a rogue state being able to build a bomb was having fissile material. (I’m a bit unclear about this: it seems that this is true for dirty bombs, but it is extremely hard to make a fission bomb work.)

A facility in Kazakhstan arranged to ship beryllium to Iran, failing to do so only because of a paperwork glitch. The Kazakh government consented to a US operation to extract uranium before it could be sold to Iran. This appears to have been a very dangerous operation, involving driving trucks in icy conditions and the longest C-5 flights in history. (Presumably transporting enriched uranium this way was considered preferable to letting it be bought by rogue states.) Successful sting operations on illegal sales raised the profile of the problem, gaining further support for these efforts; previously, Congress had been unwilling to spend money helping former USSR states dispose of weapons material.

Theft and illegal sales were a high risk in part because fissile material was stored in poor conditions with low security. The weakest security was often for enriched uranium intended for civilian use. In one case, staff at a storage facility stole fissile material using only a crowbar and a hacksaw.

Thanks to Claire Zabel and Andrew Snyder-Beattie for recommending the book, and to Sim Dhaliwal and Ollie Base for suggesting clarifications.


8 comments

Just wanted to say I thought this post was great and really appreciate you writing it! I have a hard-to-feed hunger to know what the real situation with nuclear weapons is like, and this is one of the only things to touch it in the past few years. Any other resources you'd recommend?

I'm surprised and heartened to hear some evidence against the "Petrov singlehandedly saved the world" narrative. Is there somewhere I can learn about the other nuclear 'close calls' described in the book? (should I just read the book?)

Thanks! Here are some places you might start. (People who have done deeper dives into nuclear risk might have more informed views on what resources would be useful.)

  • Baum et al., 2018, A Model For The Probability Of Nuclear War makes use of a more comprehensive list of (possible) close calls than I've seen elsewhere.
  • FLI's timeline of close calls is a more (less?) fun display, which links to more detailed sources. Note that many of the sources are advocacy groups with a certain spin.
  • Picking a few case studies that seemed important and following the citations to the most direct historical accounts to better understand how close a call they really were might be a project which would interest you.
  • I thought this interview with Samantha Neakrase of the Nuclear Threat Initiative was helpful for understanding what things people in the nuclear security community worry about today.

Some broader resources

  • The probability of nuclear war is only one piece of the puzzle – even a nuclear war would probably not end the world, thankfully. I found the recent Rethink Priorities nuclear risk series (#1, #2, #3, #4, #5, especially #4) very helpful for putting more of the pieces together.
  • This Q&A with climate scientist Luke Oman gets across some key considerations very efficiently.

I'm also glad that you interpret the discussion of the Petrov incident as 'some evidence against'. That's about the level of confidence I intended to convey.

I recently started to feel that celebrating Petrov was a bad choice: he just happened to be in the right place at the right time, and as you say, there were many false positives at the time. Petrov's actions were important, but they provide no lessons for those who aspire to reduce x-risk.

A better example might be Carl Sagan, who (if I'm correct) researched nuclear winter and successfully advocated against nuclear weapons by conveying the risk of nuclear winter. This seems to have contributed to Gorbachev's conviction to mitigate nuclear war risk. This story has many components EA cares about: doing research to figure out the impacts, advocating with good epistemics, knowing that making an impact is complex, having a strong vision of what is valuable, searching for cooperation, and effectively changing the minds of influential actors.

[Stumbling upon this a year late and sharing a low-confidence hot take, based mostly on Wikipedia]

I think Carl Sagan's research and advocacy on nuclear winter is definitely an interesting example to consider, but I'm not sure it's one we should aim to emulate (at least not in its entirety). And I currently have the impression that he probably did not have good epistemics when doing this work. 

My impression is that: 

  • Scientists seem quite divided on how likely nuclear winter would be, and what its consequences would be, given various possible nuclear exchanges
  • Some people seem to think the early study Sagan was involved with deliberately erred towards alarmism in order to advance the cause of disarmament
  • Evidence from Kuwait oil well fires seems to have not matched the predictions of that study

(I'm hoping to learn more about nuclear winter in the coming months, and would probably have more substantive things to say at that point.)

One reason the Sagan example may be interesting is that it could help us think about how to make - or find ways to avoid having to make - tradeoffs between maintaining good epistemics and influencing things in high-profile, sensitive political areas. 

Good points! I broadly agree with your assessment Michael! I'm not at all sure how to judge whether Sagan's alarmism was intentionally exaggerated or the result of unintentional poor methodology. And then, I think we need to admit that he was making the argument in a (supposedly) pretty impoverished research landscape on topics such as this. It's only expected that researchers in a new field make mistakes that seem naive once the field is further developed.

I stand by my original point to celebrate Sagan > Petrov though. I'd rather celebrate (and learn from) someone who acted pretty effectively, even if imperfectly, in a complex situation, than someone who happened to be in the right place at the right time. I'm still incredibly impressed by Petrov though! It's just... hard to replicate his impact.

Thanks for this post! I hadn't heard of this book before, but the parts you summarised seemed interesting, so I've now downloaded it and will listen to it soon.

I've also just written a question post to request recommendations of books relevant to nuclear risk, WMDs, and/or great power war, and there I mentioned this book and this post, in case other people would also benefit from being pointed in this direction.

I’m a bit unclear about this: it seems that this is true for dirty bombs, but it is extremely hard to make a fission bomb work.

I'm far from an expert, but Global Catastrophic Risks makes it sound like that's not the case:

With modern weapons-grade uranium, the background neutron rate is so low that terrorists, if they have such material, would have a good chance of setting off a high-yield explosion simply by dropping one half of the material onto the other half. Most people seem unaware that if separated HEU is at hand it's a trivial job to set off a nuclear explosion ... even a high school kid could make a bomb in short order.

(the book is actually quoting Luis Alvarez there)

A US government sponsored experiment in the 1960s suggests that several physics graduates without prior experience with nuclear weapons and with access to only unclassified information could design a workable implosion type bomb. The participants in the experiment pursued an implosion design because they decided a gun-type device was too simple and not enough of a challenge (Stober, 2003).

I stand corrected. I think those quotes overstate matters a decent amount, but indeed the security of fissile material is a significantly more important barrier to misuse.