epistemic status: I am fairly confident that the overall point is underrated right now, but am writing quickly and think it's reasonably likely the comments will identify a factual error somewhere in the post.
The risk of a serious nuclear incident seems unusually elevated right now, as a result of Russia badly losing the war in Ukraine. Various markets put the risk at about 5-10%, and various forecasters seem to estimate something similar. The general consensus is that Russia, if they used a nuclear weapon, would probably deploy a tactical nuclear weapon on the battlefield in Ukraine, probably in a way with a small number of direct casualties but profoundly destabilizing effects.
A lot of effective altruists have made plans to leave major cities if Russia uses a nuclear weapon, at least until it becomes clear whether the situation is destabilizing. I think if that happens we'll be in a scary situation, but based on how we as a community collectively reacted to Covid, I predict an overreaction -- that is, I predict that if there's a nuclear use in Ukraine, EAs will incur more costs in avoiding the risk of dying in a nuclear war than the actual expected costs of dying in a nuclear war, more costs than necessary to reduce the risks of dying in a nuclear war, and more costs than we'll endorse in hindsight.
With respect to Covid, I am pretty sure the EA community and related communities incurred more costs in avoiding the risk of dying of Covid than was warranted. In my own social circles, I don't know anyone who died of Covid, but I know of a healthy person in their 20s or 30s who died because they failed to seek medical attention out of fear of Covid. A lot of people incurred hits to their productivity and happiness that were quite large.
This is especially true for people doing EA work they consider directly important: being 10% less impactful at an EA direct work job has a cost measured in many human or animal or future-digital-mind lives, and I think few people explicitly calculated how that cost measured up against the benefit of reduced risk of Covid.
If Russia uses a nuclear weapon in Ukraine, here is what I expect to happen: a lot of people will be terrified (correctly assessing this as a significant change in the equilibrium around nuclear weapon use, one which makes a further nuclear exchange much more likely). Many people will flee major cities in the US and Europe. They will spend a lot of money, take a large productivity hit from being somewhere with worse living conditions and worse internet, and spend a ton of their time obsessively monitoring the nuclear situation. A bunch of very talented ops people will work incredibly hard to get reliable fast internet in remote parts of Northern California or northern Britain. There won't be much that EAs not already in nuclear policy and national security can do, but there'll be a lot of discussion and a lot of people trying to get up to speed on the situation, or feeling a strong need to know what's going on constantly. The stuff we do is important, and much less of it will get done. It will take a long time for it to become obvious whether the situation is stable, but eventually people will mostly go back to cities (possibly leaving again if there are further destabilizing events).
The recent Samotsvety forecast estimates that a person staying in London will lose 3-100 hours to nuclear risk in expectation (edit: this goes up by a factor of 6 in the case of actual tactical nuke use in Ukraine). I think it is really easy for that person to waste more than 3-100 hours by being panicked, and possible to waste more than 20-600 hours on extreme response measures. And that's the life-hour cost of never fleeing; you also have the option of fleeing at a later point if there are further worrying developments, and it's probably a mistake to model 'flee as soon as there's tactical nuke use' only against 'stay no matter what' and not against 'flee slightly later'.
Some degree of costs incurred is quite reasonable. I think that during the Cuban Missile Crisis we were quite close to nuclear war, and probably reasonable people at the time would have (in addition to trying to prevent such a war) tried to leave major cities. I think it might make sense for people whose work is already remote, and who have a non-major-city place to stay, to leave. I think that the fact EAs are weird, and take our beliefs more seriously than most people, and take concrete actions based on expected-value arguments, is a strength. Certainly at some threshold of risk I'll leave with my family. But my overall expectation is that we'll end up causing more disruptions than are justified, at substantial expense to other work which is also about securing a good human future.
Some specific bad tradeoffs that seem easy to avoid:
- going to a very remote area dramatically reduces the risk of being hit in a nuclear exchange, but makes it incredibly inconvenient to work normally/collaborate with others/etc. DC, New York, and San Francisco are among the highest-likelihood-of-being-hit-in-a-full-nuclear-exchange cities in the US, and London obviously the highest in the UK, but if you go from those cities to nearby populous cities you probably get most of the benefit and incur lower costs in productivity/etc. In my opinion Americans who don't live in those cities and don't live next to a base from which we launch our own nukes shouldn't bother leaving (I don't know enough about continental Europe to have opinions there.) For San Franciscans, going to Santa Rosa is probably nearly as good as going to the middle of the mountains, or going to Eugene, and it's much less costly.
- we can avoid socially pressuring people towards acting more strongly than they endorse: if you don't care about this, I think you should feel licensed to not care about it and go about your life normally.
- if people do want to leave, I think they should keep in mind the productivity costs of (1) bad internet and (2) isolation, and strongly prioritize going somewhere non-isolated with good internet access.
- keep in mind that while expected value sometimes implies reacting strongly to things with only a small chance of being really bad, you have to actually do the expected value calculations -- and your wellbeing, productivity, and impact on the world should be an input.
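To make the last point concrete, here is a toy version of that expected value calculation, using the Samotsvety-style figures quoted earlier (3-100 expected life-hours lost from staying in London, roughly 6x higher after actual tactical nuke use). The cost-of-fleeing numbers are pure illustrative assumptions, not estimates:

```python
# Toy expected-value comparison of staying in a major city vs. fleeing.
# All inputs are illustrative assumptions, not real forecasts.

def midpoint(low, high):
    """Crude point estimate: midpoint of an expected life-hours-lost range."""
    return (low + high) / 2

# Samotsvety-style range for staying in London: 3-100 expected life-hours
# lost, rising ~6x after an actual tactical nuke use in Ukraine.
stay_now = midpoint(3, 100)   # midpoint of the current range
stay_post = 6 * stay_now      # after a tactical nuke use

# Hypothetical cost of fleeing: a 25% productivity hit on 8-hour workdays
# for 30 days away (worse internet, disruption, monitoring the news).
flee_cost = 0.25 * 8 * 30

print(f"Staying (today):         ~{stay_now:.0f} expected hours lost")
print(f"Staying (post-nuke-use): ~{stay_post:.0f} expected hours lost")
print(f"Fleeing (assumed):       ~{flee_cost:.0f} hours lost")
```

Under these made-up numbers, fleeing today costs more than it saves, but the comparison could flip after a destabilizing event; the point is to run your own numbers, with your own wellbeing and productivity as inputs, rather than react on vibes.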
Disagree. I think it is best to behave "reasonably" until proven otherwise. Things like (1) sterilizing mail and delivered groceries, or (2) trying for total bubbling when everyone you know is already being highly, highly careful, always seemed like covid theater, and it disturbed me that EAs didn't recognize that through common sense. It also led to some social injustices, I think, like suppressing housemates' right to see family or their live-out romantic partners.
Yes, I want EAs to be faster than experts at taking "costly preventative acts", but the expected value should be clearly positive. And with EA productivity worth a lot, that is a high bar. Firstly, I think, be very sure the acts are effective, to counterbalance the uncertainty in the other variable (the likelihood of harm from the problem you are trying to avoid). If you can't yet establish effectiveness with data, check your intuition (which EAs mostly didn't do, I think: people mostly chose sterilizing things over HEPA filters, wtf, and mostly didn't downgrade the risk of meeting with "quarantined people" as the quarantined person neared the end of their two weeks still showing no symptoms).
[EDIT: RE: determining effectiveness of relocation against nuclear risk, I asked a question in comment below]
Transfer your argument that EAs should be "extremely paranoid at the start" to nuclear risk, and you get the advice that people should basically move, idk, this month? Or start renting a house now that remains empty but is ready for them in case of rapid departure? Possibly even spend their work hours arranging any of this? These mostly seem too big for where we are now, IMO (the house rental seems decentish, because knowing they did something could calm people's anxieties, and a house in the boonies is cheap).