We are discussing the debate statement: "On the margin[1], it is better to work on reducing the chance of our[2] extinction than increasing the value of futures where we survive[3]". You can find more information in this post.
When you vote and comment on the debate week banner, your comment will also appear here, along with a note indicating your initial vote and your most recent vote (if your opinion has changed).
However, you can also comment here any time throughout the week. Use this thread to respond to other people's arguments, and develop your own.
If there are a lot of comments, consider sorting by “New” and interacting with comments that haven’t been voted or commented on yet.
Also, perhaps don’t vote low-effort submissions below zero karma; we don’t want to discourage low-effort takes on the banner.
1. ^ ‘on the margin’ = think about where we would get the most value out of directing the next indifferent talented person or indifferent funder.
2. ^ ‘our’ and ‘we’ = Earth-originating intelligent life (i.e. we aren’t just talking about humans, because most of the value in expected futures is probably in worlds where digital minds matter morally and are flourishing).
3. ^ Through means other than extinction risk reduction.
I think "increasing value of futures where we survive" is broad enough that plenty of non-EA stuff like just foreign aid or governance reform stuff generally would count and X-Risk stuff is very specific and niche.
There is a misunderstanding here: "increasing the value of futures where we survive" is itself an x-risk reduction intervention.
See the comment by MacAskill (https://forum.effectivealtruism.org/posts/TeBBvwQH7KFwLT7w5/william_macaskill-s-shortform?commentId=jbyvG8sHfeZzMqusJ), which clarifies that the debate is between extinction risks and alignment risks (i.e. increasing the value of the future), both of which are x-risks. The debate is not between x-risks as a whole and alignment risks.
One of the most impactful ways to increase the value of futures where we survive is to work on AI governance and technical AI alignment.