I have an intuition that eliminating the severe suffering of, say, 1 million people might be more important than creating hundreds of trillions of happy people who would otherwise never exist. It's not that I think there is no value in creating new happy people. It's just that I think (a) the value of creating new happy people is qualitatively different from that of reducing severe suffering, and (b) sometimes, when two things are of qualitatively different value, no amount of the one can add up to a certain amount of the other.
For example, consider two "intelligence machines" with qualitatively different kinds of intelligence. One does complex abstract reasoning and the other counts. I think that no matter how much better you made the counting machine at counting, it would never surpass the intelligence of the abstract-reasoning machine. Even though the counting machine gets more intelligent with each improvement, it never matches the abstract-reasoning machine, since that machine's intelligence is of a qualitatively different and superior kind. Similarly, I value both deep romantic love and eating french fries, but I wouldn't trade a deep and fulfilling romance for any amount of french fries (even if I never got sick of fries). And I value both human happiness and ant happiness, but I wouldn't trade a million happy humans for any amount of happy ants.
In the same vein, I suspect that the value of reducing the severe suffering of millions is qualitatively different from, and superior to, the value of creating new happy people, such that the latter can never match the former.
Do you think there's anything to this intuition?