Most people in EA are afraid of extinction risk. If the expected value of humanity's future is net positive, then we really should prevent human extinction. But there are many uncertainties: AI, the importance of s-risks, the evolution of humanity... I think humanity's future is chaotic. Can we estimate objectively whether humanity's future is net positive or net negative, or can we only rely on our moral intuition?
I think this is a good and important question. I also agree that humanity's predicament in 500 years is wildly unpredictable.
But there are some considerations that can guide our guess:
If you begin totally unsure whether the future is good or bad in expectation, then considerations like these might break the symmetry (while you remain entirely open to the possibility that the future is bad).
This post might also be useful; it recomplicates things by giving some considerations on the other side.
Thank you very much for sharing. When I read this, though, something feels a little unnatural, even weird. If we discuss the long-term future, humans might face aliens, other civilizations, superintelligence... Human personality might also change through evolution. I feel the prediction is too subjective.