I have not researched longtermism deeply. However, what I have found out so far leaves me puzzled and skeptical. As I currently see it, you can divide what longtermism cares about into two categories:
1) Existential risk.
2) Common sense long-term priorities, such as:
- economic growth
- environmentalism
- scientific and technological progress
- social and moral progress
Existential risk isn’t a new idea (it predates longtermism), and economic growth, environmentalism, and societal progress aren’t new ideas either. Suppose I already care a lot about low-probability existential catastrophes and I already buy into common sense ideas about sustainability, growth, and progress. Does longtermism have anything new to tell me?
Maybe? This depends on what you think about the probability that intelligent life re-evolves on Earth (it seems likely to me) and how good you feel about the next intelligent species on Earth vs. humans.
IMO, most x-risk from AI probably doesn't come from literal human extinction but instead AI s...