Phil Torres has an article criticizing Longtermism. I'm posting here in the spirit of learning from serious criticism. I'd love to hear others' reactions: https://www.currentaffairs.org/2021/07/the-dangerous-ideas-of-longtermism-and-existential-risk
Spelling out his biggest concern, he says "Even more chilling is that many people in the community believe that their mission to “protect” and “preserve” humanity’s 'longterm potential' is so important that they have little tolerance for dissenters." He asserts that "numerous people have come forward, both publicly and privately, over the past few years with stories of being intimidated, silenced, or 'canceled.'" This doesn't match my experience. I find the EA community loves debate and questioning assumptions. Have others had this experience? Are there things we could do to improve as a community?
Another critique Torres makes comes down to Longtermism being intuitively bad. I don't agree with that, but I bet it is a convincing argument to many outside of EA. For a large number of people, Longtermism can sound crazy. Maybe this has implications for communications strategy. Torres gives examples of Longtermists minimizing global warming. A better framing for Longtermists to use could be something like "global warming is bad, but these other causes could be worse and are more neglected." I think many Longtermists, including Rob Wiblin of 80,000 Hours, already employ this framing. What do others think?
Here is the passage where Torres casts Longtermism as intuitively bad:
If this sounds appalling, it’s because it is appalling. By reducing morality to an abstract numbers game, and by declaring that what’s most important is fulfilling “our potential” by becoming simulated posthumans among the stars, longtermists not only trivialize past atrocities like WWII (and the Holocaust) but give themselves a “moral excuse” to dismiss or minimize comparable atrocities in the future.
Phil Torres's tendency to misrepresent things aside, I think we need to take his article as an example of the severe criticism that longtermism, as currently framed, is liable to attract, and reflect on how we can present it differently. It's not hard to read this sentence on the first page of (EDIT: the original version of) "The Case for Strong Longtermism":
"we can in the first instance often simply ignore all the effects contained in the first 100 (or even 1000) years"
and conclude, as Phil Torres does, that longtermism means we can justify causing present-day atrocities for a slight, let's say 0.1%, increase in the subjective probability of a valuable long-term future. Thinking rationally, atrocities do not improve the long-term future, and longtermists care a lot about stability. But with the framing given by "The Case for Strong Longtermism", there is a small risk, higher than it needs to be, that future longtermists could be persuaded that atrocities would be justified, especially given how subjective those probabilities are. How can we reframe or redefine longtermism so that, firstly, we reduce the risk of longtermism being used to justify atrocities, and secondly (and I think more pressingly), we reduce the risk that longtermism is generally seen as something that justifies atrocities?
It seems like this framing of longtermism is a far greater reputational risk to EA than, say, how 80,000 Hours over-emphasized earning to give, which is something that 80,000 Hours apparently seriously regrets.
I think "The Case for Strong Longtermism" should be revised to not say things like "we can in the first instance often simply ignore all the effects contained in the first 100 (or even 1000) years", without detailing significant caveats. It's just a working paper—shouldn't be too hard for Greaves and MacAskill to revise.(EDIT: this has already happened, as Aleks_K has pointed out below.) If there are many more articles like Phil Torres's here written in other media in the near future, I would be very hesitant about using the term "longtermism". Phil Torres is someone who is sympathetic to effective altruism and to existential risk reduction, someone who believes "you ought to care equally about people no matter when they exist"; now imagine if the article were written by someone who isn't as sympathetic to EA.(This really shouldn't affect my argument, but I do generally agree with longtermism.)
FYI, this has already happened. The version you are linking to is outdated, and the updated version here no longer contains this statement.