Phil Torres has an article criticizing Longtermism. I'm posting here in the spirit of learning from serious criticism. I'd love to hear others' reactions: https://www.currentaffairs.org/2021/07/the-dangerous-ideas-of-longtermism-and-existential-risk
Spelling out his biggest concern, he says "Even more chilling is that many people in the community believe that their mission to “protect” and “preserve” humanity’s 'longterm potential' is so important that they have little tolerance for dissenters." He asserts that "numerous people have come forward, both publicly and privately, over the past few years with stories of being intimidated, silenced, or 'canceled.'" This doesn't match my experience. I find the EA community loves debate and questioning assumptions. Have others had this experience? Are there things we could do to improve as a community?
Another critique Torres makes comes down to Longtermism being intuitively bad. I don't agree with that, but I bet it is a convincing argument to many outside of EA. For a large number of people, Longtermism can sound crazy. Maybe this has implications for communications strategy. Torres gives examples of Longtermists minimizing global warming. A better framing for Longtermists to use could be something like "global warming is bad, but these other causes could be worse and are more neglected." I think many Longtermists, including Rob Wiblin of 80,000 Hours, already employ this framing. What do others think?
Here is the passage where Torres casts Longtermism as intuitively bad:
If this sounds appalling, it’s because it is appalling. By reducing morality to an abstract numbers game, and by declaring that what’s most important is fulfilling “our potential” by becoming simulated posthumans among the stars, longtermists not only trivialize past atrocities like WWII (and the Holocaust) but give themselves a “moral excuse” to dismiss or minimize comparable atrocities in the future.
Posts criticizing Longtermism, whether written directly by the author or, as in this case, deliberately brought in, aren't just far below par in content; they also seem intellectually dishonest.
Frankly, it suggests that criticism of this topic attracts a malign kind of personality.
I am worried that this apparent pattern will send a signal that shapes how EAs behave, both internally and externally.
This matters because I know some senior EAs who are skeptical of Longtermism.
I am not as skeptical myself: it seems natural to place value on (possibly huge?) future populations, and the pandemic seems to have given overwhelming evidence that x-risk and related projects are hugely underserved.
My concern is that Longtermism benefits from a number of rhetorical advantages that most people do not perceive, and that this is unhealthy.
I don't really want to lay out the full case, as I am worried I would join the others in creating a stinker of a post, or that I am indulging some personal streak of narcissism.
Still, I am worried that this pattern of really bad criticism doesn't help. I do not know what to do.