I think it'd be bizarre if the war in Ukraine didn't shift our funding priorities in some way. WW3 now appears likelier, and likelier to happen sooner. Presumably, this should shift more focus towards (a) preventing it and (b) minimizing its harm.
The episode was fun to listen to! It was also my introduction to the podcast, which I'll be listening to more of.
There were some good takes, but I was not impressed by the following section of Samo's:
So, as US hegemony recedes globally, I think there will be more wars. Not necessarily in Europe, though I predict there will be wars in the following parts of Europe:
I think the Mediterranean, there will be one, there will be some large wars fought in the Mediterranean as there will be a new balance of power between European countries that aspire to be Mediterranean powers, especially France, and to a much lesser extent, Spain and Italy versus Eastern rising powers, such as Turkey.
He's predicting that members of NATO will fight each other? What -- in the next 50 years? Where would a Metaculus question asking about that poll? Less than 5%, I'd guess.
... And I think there'll be wars in all of the former Soviet spaces because the balance of power is going to be more and more unfavorable to the American side as the US slowly and inevitably has to withdraw from the world because its relative economic and political weight is smaller.
Wait, so there'll be wars because the US is relatively weaker compared to ... who exactly? China? Because I certainly doubt it'd be Russia who'll rise from the ashes riding glorious Khrushchev levels of growth.
So I'm not talking about an absolute decline. It's just that the very fact that China has risen means that even if Russia continues to grow weaker in the future, possibly due to sanctions, possibly due to political instability, et cetera, et cetera, China can still start projecting power all over the place.
That makes it seem like China will be the driving force behind wars in "all of the former Soviet spaces"? And Russia will be happy with that? That seems wild.
Thank you for writing that Ed!
Hi, I'm afraid I don't have any terribly helpful advice. My family and other people I know are having the same struggle.
The best I can come up with is that the Metaculus community gives a 20% chance of WW3 breaking out before 2050. That's definitely way too high, but I assume that most of the probability mass is distributed somewhat evenly over time. The same community also places a 2% chance on a NATO nation invoking Article 5 in the next year, which would presumably not equate to nuclear war in the same circumstance.
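For what it's worth, here's how I roughly translated that cumulative forecast into a yearly figure. This is a back-of-the-envelope sketch that assumes (almost certainly too simply) a constant annual hazard spread over the ~28 years to 2050; the numbers are just the Metaculus figures above:

```python
# Cumulative forecast from the text: 20% chance of WW3 before 2050.
# Assumption (a simplification): the risk comes from a constant annual hazard.
cumulative_p = 0.20
years = 28  # roughly 2022 through 2050

# Solve 1 - (1 - annual_p)**years = cumulative_p for annual_p.
annual_p = 1 - (1 - cumulative_p) ** (1 / years)

print(f"Implied annual probability: {annual_p:.2%}")  # → about 0.79% per year
```

On that very rough model, the headline 20% corresponds to under 1% in any given year, which is part of why the per-year numbers worry me less than the cumulative one.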
However, I think recent events should make the EA community ask themselves and each other "what should we do if these risks increase?" At what probability of WW3 do we start shifting EA resources towards work on prevention / recovery? At what chance do we as a community start moving to safer locations?
My family is trying to formulate a plan that's something like "if the probability of WW3 in the next year surpasses 33%, then we're going to temporarily relocate to another country until the tensions subside peacefully."

Obviously that's not possible for many, but talking about it has settled our nerves a bit.
I hope you find some peace.
Thank you! I edited the post to reflect your updated text.
Would you be able to provide a plainer-language summary of the paper's conclusions or arguments? I think I'm interested in the topics discussed in the paper, but it's unclear to me what the arguments actually are, so I'm inclined to disengage.
Take this sentence, which seems important:
“We argue that while the moral uncertainty approach cannot vindicate an exceptionless public justification principle, it gives us reason to adopt public justification as a pro tanto institutional commitment.”
I do not understand this and so I do not see how this is a valuable addition to the critical topic of moral uncertainty.
That crisis was resolved when President Dwight Eisenhower sent the National Guard to Arkansas to integrate Central High School.
Small note: A division of the US Army (the 101st Airborne) was called in, in response to Faubus ordering the Arkansas National Guard to block integration. I think the details show how the situation was one of the most precarious Federal-State conflicts since the Civil War, and I think that'd influence how I would respond to the question.
A related thought:
Some humans are much less sensitive to physical pain.
1. Could an observer correctly differentiate between those with normal and abnormally low sensitivity to pain?
2. For humans who're relatively insensitive to pain, but still exhibit the appropriate response to harm signals (assuming they exist), would analgesics diminish the "appropriateness" of their response to a harm signal?
Edit: This comment now makes less sense, given that Abby has revised the language of her comment.
I strongly endorse what you say in your last paragraph:
Please provide evidence that "dissonance in the brain" as measured by a "Consonance Dissonance Noise Signature" is associated with suffering? ... I'm willing to change my skepticism about this theory if you have this evidence.
However, I'd like to push back on the tone of your reply. If you're sorry for posting a negative non-constructive comment, why not try to be a bit more constructive? Why not say something like "I am deeply skeptical of this theory and do not at this moment think it's worth EAs spending time on. [insert reasons]. I would be willing to change my view if there was evidence."
Apologies for being pedantic, but I think it's worth the effort to try and keep the conversation on the forum as constructive as possible!
I found that this episode increased my faith in the EA community a little bit. One of my caricatures of other EAs when I first found the community was "it's good these people exist but they'd make terrible friends because they're so impartial they'd leave me in a rut to squeeze the epsilon out of an EV that bears a resemblance to a probability."
It was a bit of an (irrational?) fear that EAs and EA orgs were constituted by hyper-utilitarians that'd sacrifice their friends / employees if the felicific calculus didn't add up.
But most people I've met in (at least my section of) the EA community have been unusually kind and compassionate people. Some I am very glad to call my friends. And I don't think they would jettison me if I gained a debilitating illness, which makes me more motivated to do good.
Note: Of course there's instrumental utilitarian reasons to act in a manner more consistent with commonsense decency.
This made me want to hear more narratives and cases like this that give a helpful but honest report of what someone's experience of mental illness was like. I've thus far avoided the extant literature out of a fear that reading / listening to cases of people experiencing severe mental illness would degrade my own well-being.
In particular, I'd like to hear more stories from other people in the EA community (there have kind of been a few on the forum) who weren't as lucky as Howie.