[This is a personal and emotional post about my feelings about EA. Probably not for everyone! Could cause sadness, scrupulosity concerns, and guilt.]
I think it's true that 2x the suffering is 2x as bad, and it would be emotionally accurate for it to make me 2x as sad, i.e. if it did, my emotions would better reflect reality. But I worry that a lot of people get tangled up in the distinction between the emotional accuracy of emotions and their instrumental value. The two are often correlated; it's useful to be more scared of dying in a car crash than of dying by lion attack. And emotions can be motivating, so having emotions that reflect reality can make us more effective.
But this gets tricky with EA.
I believe the moral importance of suffering increases linearly as suffering increases, but there are non-linear marginal returns to having emotions that reflect that. Just as there are instrumentally rational techniques that require irrationality, there are instrumentally useful emotions that require emotional inaccuracy. I don't know what emotions are most instrumentally useful for improving the world, but they're probably not going to be the ones that correspond linearly to the reality of the amounts of suffering in the world.
The only seemingly morally relevant experiences I know from the inside are my own: my subjective feelings of joy and serenity and curiosity and sorrow and anger and apathy. In practice, even at my most emotionally expansive moments, I can only hold in my mind the morally relevant experiences I think I have in my median hour. So I can maybe comprehend less than 1/140,000 of the morally important things I've personally felt*. I don't know if I'm an outlier in that regard, but I'm pretty certain that I am completely incapable of emotionally understanding more than a tiny fraction of the value of a single life (even when I have the huge advantage of having felt that life from the inside). And that's not changing any time soon.
Yet it somehow seems to be true that billions or trillions of beings are having morally relevant experiences right now, and had them in the past, and (many times) more could have morally relevant experiences in the future. My emotions are not well-equipped to deal with this; they can't really understand numbers bigger than about three or experiences longer than an hour (true story; I may be unusually incompetent in this regard, but probably not by many orders of magnitude).
The cost to save a human life might be a few thousand dollars. The value of each sentient life is incomprehensibly vast**. EA is a "bargain" because so many lives are so drastically undervalued by others. And resources are scarce; even if no lives were undervalued relative to others, we still couldn't give everyone what their value alone would compel us to give, even with far more resources than we have.
Having to triage is desperately sad. The fact that we can't help everyone is terrible and tragic, and we should never stop fighting to be able to help everyone more. I worry about losing sight of this, about denying the emotional correctness of feeling an ocean of sorrow for the suffering around us. To feel that sorrow fully is impossible, and it would be debilitating if we could.
I can't emotionally comprehend all of what I'm doing and not doing, and wouldn't choose to if I could. That's why, for me, effective altruism is a leap of faith. I'm learning to live a life I can't emotionally fully understand, and I think that's okay. But I think it's good to remind myself, from time to time, what I'm missing by necessity.
*Assuming I have no morally relevant experiences while sleeping, which seems unlikely to be true.
**With the exception of borderline-sentient or very short-lived beings that have lives with little (but nonzero!) moral value.