Effective altruism is based on the core belief that all people count equally. We unequivocally condemn Nick Bostrom’s recklessly flawed and reprehensible words. We reject this unacceptable racist language, and the callous discussion of ideas that can and have harmed Black people. It is fundamentally inconsistent with our mission of building an inclusive and welcoming community.
— The Centre for Effective Altruism
I don't think this goes through. Let's just talk about the hypothetical of humanity's evolutionary ancestors still being around.
Unless you assign the same moral weight to an ape as to a human, this means you will almost certainly assign lower moral weight to early humans or to nearby species further back in our evolutionary tree, primarily on the basis of genetic differences, since there isn't any clean line to draw between humans and our evolutionary ancestors.
Similarly, I don't see how you can be confident that your moral concern in the present day is independent of exactly that genetic variation in the population. That genetic variation is exactly the variation that, amplified by many rounds of selection, made you care more about humans than about other animals, and as such, it would be very surprising if there were absolutely no difference in moral patienthood among the present human population.
Again, I expect that variance to be quite small, since genetic variance within the human population is much smaller than the variance between species, and I also expect it not to align well with classical racist tropes; but the nature of the variance is ultimately the same.
And the last part of the sentence I quoted also seems hard to square with this. Digital people might have hugely varying levels of capacity for suffering, happiness, and other things we care about, including different EMs. I do hope we create beings with much greater capacity for happiness than us, and would consider that one of the moral priorities of our time.