Hi!
I'm Tobias Baumann, co-founder of the Center for Reducing Suffering, a new longtermist research organisation focused on figuring out how we can best reduce severe suffering, taking into account all sentient beings. Ask me anything!
A little bit about me:
I’m interested in a broad range of research topics related to cause prioritisation from a suffering-focused perspective. I’ve written about risk factors for s-risks, different types of s-risks, as well as crucial questions on longtermism and artificial intelligence. My most-upvoted EA Forum post (together with David Althaus from the Center on Long-Term Risk) examines how we can best reduce long-term risks from malevolent actors. I’ve also explored various other topics, including space governance, electoral reform, improving our political system, and political representation of future generations. Most recently, I’ve been thinking about patient philanthropy and the optimal timing of efforts to reduce suffering.
Although I'm most interested in questions related to those areas, feel free to ask me anything. Apologies in advance if there are any questions which, for any of many possible reasons, I’m not able to respond to.
What are some common misconceptions about the suffering-focused world-view within the EA community?
[Warning: potentially disturbing discussion of suicide and extreme suffering.]
I agree with many of the points made by Anthony. It is important to control for these confounding factors, and to make clear in this thought experiment that the person in question could not reduce more suffering for others, and that the suicide would cause less suffering in expectation (which is plausibly false in the real world, not least because suicide attempts can go horribly wrong; Humphry, 1991, “Bizarre ways to die”). (So to be clear, and a…