Most people in EA are concerned about extinction risk. If the expected value of humanity's future is net positive, then we really should prevent human extinction. But there are many uncertainties: AI, the importance of s-risks, the future evolution of humanity, and so on. Humanity's future seems chaotic to me. Can we estimate objectively whether humanity's future is net positive or net negative, or can we only rely on our moral intuitions?
Thank you very much for sharing this. When I read it, though, it felt a little unnatural to me. If we are discussing the long-term future, humans might encounter aliens, other civilizations, or superintelligence, and human personality itself may change through evolution. The prediction feels too subjective to me.