I'm dissatisfied with my explanation of why there is not more attention from EAs and EA funders on nuclear safety and security, especially relative to e.g. AI safety and biosecurity. This has come up a lot recently, especially after the release of Oppenheimer. I'm worried I'm not capturing the current state of affairs accurately and consequently not facilitating fully contextualized dialogue.
What is your best short explanation?
(To be clear, I know many EAs and EA funders are working on nuclear safety and security, so this is more a question of resource allocation than of inclusion in the broader EA cause portfolio.)
This feels too confident. A nuclear war followed by a supervolcano eruption is just really unlikely. Plus, if there were only 1,000 people left, there would be an enormous stock of canned goods remaining - survivors could just go to a major city and live out of a supermarket.
If a major city can support a million people for 3 days on its food reserves, it can support 1,000 people for roughly eight years.
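A quick sanity check on that figure (a back-of-the-envelope calculation that treats reserves as interchangeable person-days of food and ignores spoilage):

$$
\frac{10^{6}\ \text{people} \times 3\ \text{days}}{10^{3}\ \text{people}} = 3{,}000\ \text{days} \approx 8.2\ \text{years}
$$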
Again, I'm not saying it doesn't matter, but I think these are good reasons why it receives less attention than AI safety.