I'm landfish, aka Jeffrey Ladish. I'm a security researcher and risk consultant focused on global catastrophic threats. My website is at https://jeffreyladish.com
Yeah, I would agree with that! I think radiological weapons are some of the most relevant nuclear capabilities / risks to consider from a long-term perspective, given the risk that they could be developed in the future.
The part I added was: "By a full-scale war, I mean a nuclear exchange between major world powers, such as the US, Russia, and China, using the complete arsenals of each country. The total number of warheads today (14,000) is significantly smaller than during the height of the Cold War (70,000). While extinction from nuclear war is unlikely today, it may become more likely if significantly more warheads are deployed or if designs of weapons change significantly."

I also think indirect extinction from nuclear war is unlikely, but I would like to address this more in a future post. I disagree that additional clarifications are needed. I think people made these points clearly in the comments, and that anyone motivated to investigate this area seriously can read those. If you want to try to double-crux on why we disagree here, I'd be up for that, though a call might be preferable for saving time.
Thanks for this perspective!
I mean that the amount required to cover every part of the Earth's surface would serve no military purpose. Or rather, it might enhance one's deterrent a little bit, but it would:

1) kill all of one's own people, which is the opposite of a defense objective
2) not be a very cost-effective way to improve one's deterrent. In nearly all cases it would make more sense to expand second-strike capabilities by adding more submarines, mobile missile launchers, or other stealth second-strike weapons.

Which isn't to say this couldn't happen! Military research teams have proposed crazy plans like this before. I'm just arguing, as have many others at RAND and elsewhere, that a doomsday machine isn't a good deterrent, compared to the other options that exist (and given the extraordinary downside risks).
FWIW, my guess is that you're already planning to do this, but I think it could be valuable to carefully consider information hazards before publishing on this [both because of messaging issues similar to the one we discussed here and potentially on the substance, e.g. unclear if it'd be good to describe in detail "here is how this combination of different hazards could kill everyone"]. So I think e.g. asking a bunch of people what they think prior to publication could be good. (I'd be happy to review a post prior to publication, though I'm not sure if I'm particularly qualified.)
Yes, I was planning to get review prior to publishing this. In general when it comes to risks from biotechnology, I'm trying to follow the principles we developed here: https://www.lesswrong.com/posts/ygFc4caQ6Nws62dSW/bioinfohazards I'd be excited to see, or help workshop, better guidance for navigating information hazards in this space in the future.
This may be in the Brookings estimate, which I haven't read yet, but I wonder how much cost disease + the reduction in nuclear forces has affected the cost per warhead / missile. My understanding is that many military weapon systems get much more expensive over time, for reasons I don't understand well.

Warheads could be altered to increase the duration of radiation effects from fallout, but this would also reduce their yield, and would represent a pretty large change in strategy. We've gone 70 years without such weapons, with the recent Russian submersible system as a possible exception. It seems unlikely such a shift in strategy will occur in the next 70 years, but like 3% unlikely rather than really unlikely.

It's a good point that risks of extinction could get significantly worse if different/more nuclear weapons were built & deployed, and combined with other WMDs. And the existence of 70k+ weapons in the Cold War presents a decent outside view argument that we might see that many in the future. I'll edit the post to clarify that I mean present and not future risks from nuclear war.
I think I gave the impression that I'm making a more expansive claim than I actually mean to make, and will edit the post to clarify this. The main reason I wanted to write this post is that a lot of people, including a number in the EA community, start with the conception that a nuclear war is relatively likely to kill everyone, either for nebulous reasons or because of nuclear winter specifically. I know most people who've examined it know this is wrong, but I wanted that information to be laid out pretty clearly, so someone could get a summary of the argument. I think that's just the beginning of assessing existential risk from nuclear war, and I really wouldn't want people to read my post and walk away thinking "nuclear war is nothing to worry about from a longtermist perspective." I agree that "we know that one type of existential risk from nuclear war is very small, but we don't really have a good idea of how large the total existential risk from nuclear war is". I'm planning to follow this post with a discussion of existential risks from compounding risks like nuclear war, climate change, biotech accidents, bioweapons, & others.

It feels like I disagree with you on the likelihood that a collapse induced by nuclear war would lead to permanent loss of humanity's potential / eventual extinction. I currently think humans would retain the most significant basic survival technologies following a collapse and then reacquire lost technological capacities relatively quickly. (I discussed this investigation here, though not in depth.) I'm planning to write this up as part of my compounding risks post or as a separate one.

Agreed that it's very hard to know the sign of a huge history-altering event, whether it's a nuclear war or covid.