Sometimes I wonder if people just don’t realize what is very likely in store for the future.

What would life be like after a nuclear war? What would constitute day-to-day existence when hundreds of millions of humans are desperately migrating to escape climate-induced famine? Such things aren’t polite topics. They are, however, necessary ones.

The Bulletin of the Atomic Scientists’ argument in favor of quantifying nuclear risk shares the following from Martin Hellman of Stanford University:

"A risk of one percent per year would accumulate to worse-than-even odds over the lifetime of a child born today. Even if someone were to estimate that the lower bound should be 0.1 percent per year, that would be unacceptably high—that child would have an almost ten percent risk of experiencing nuclear devastation over his or her lifetime.'

The past half decade has seen a dramatic erosion of the taboo around threatening the use of nuclear weapons, even as these weapons have proliferated across the globe. The threat will only compound and accelerate in the coming years as technological advances and continued proliferation expand the risk of nuclear Armageddon.

"Imagine that a man wearing a TNT vest were to sit down next to you and tell you that he wasn’t a suicide bomber. Rather, there are two buttons for setting off his explosive vest. One was in the White House with Trump for the last four years, and recently was given to Biden. The other is with Putin in Moscow. You’d still get away as fast as you can! " - Vinton Cerf

Yet when I look at Open Philanthropy's grants for nuclear deterrence, all I see are a few tangentially related longtermist grants on biosecurity. The word "nuclear" only seems to appear where organizations like the Nuclear Threat Initiative receive grants for biosecurity work. Why is this the case? What is the logic there? This seems like an obvious blind spot.

I wonder if EA is subject to the same myopia and complacency that have gripped our civilization since the end of the Cold War. The nuclear threat feels dated and tired. Yet a clear-eyed analysis shows that the threat has only grown.

The war in Ukraine may very well be the start of World War Three. Do any of us have the right to be surprised? What, then, is the EA response to this threat? Worship at the cult of the coming AGI singularity and escape into cyberspace? Pat ourselves on the back for the number of bednets we've funded? I'm being ungenerous in my rhetoric, but seriously, what are we doing, people?

What is preventing us from showing much greater leadership and courage in the face of what may very well be the end of humanity? Refuges and bunkers are great. Why not more? Why not take a more proactive role? Why not work to address the underlying threat at its source? 

Comments

I think this is caused by two factors: 1) longtermism has decided to distinguish between existential risks and global catastrophic risks, and 2) there is a much stronger general culture of denial around nuclear risk than around any other risk.

When, in The Precipice (which I take to have been broadly read), Toby Ord says that AI is a 1/10 risk while nuclear war and climate change are both 1/1000, essentially on the basis that he views the latter two as survivable, this created the justification to write them off. (This isn't to say this is where the trend began.)

If you talk to most people, they just don't believe nuclear war is likely. As a result, nuclear war and unaligned artificial intelligence are both treated as speculative, except that nuclear war has a long history of not happening. This is, of course, nonsense, as nuclear weapons exist right now.

Yes, 1000% on the cultural factors that have desensitized us to nuclear risk. Tyler Cowen has a nice series of posts out today on this subject: https://marginalrevolution.com/marginalrevolution/2022/08/which-is-the-hingy-est-century.html

I might find time to go back and reread The Precipice and dig into the probabilities you reference. Those seem odd. It's also odd because something that reduces humanity to subsistence levels for a very long time and eliminates ninety-some percent of the population is absolutely catastrophic. I suppose I'm a hyperbolic discounter at heart and do think that while we should care about the far future, it's really silly to get into the one-to-one logic that a human a billion years from now should be valued equally, for decision-making purposes, with one alive today or ten years from now.

I'll check out the article. 

You can find the numbers in the Risk Landscape section. Yeah, it's wonky. It seems very odd to me to A) be confident that nuclear war won't kill everyone but will just kill 90% of people, and B) let that strongly influence which risks you worry about.

Also, you might be interested in the submission I just made to the contest about many worlds and nuclear risk, as I am also begging EA to care about nuclear risk. https://forum.effectivealtruism.org/posts/Gg2YsjGe3oahw2kxE/nuclear-fine-tuning-how-many-worlds-have-been-destroyed?

Thanks for sharing. I'll check out your post. 

Do you know how one might get another copy of The Precipice? I donated mine to a friend.

If you sign up for the 80,000 Hours newsletter you can get one for free; it is also on LibGen.
