An existential risk is the risk of an existential catastrophe, i.e. one that threatens the destruction of humanity’s long-term potential (Bostrom 2012; Ord 2020a). Existential risks include natural risks, such as those posed by asteroids or supervolcanoes, as well as anthropogenic risks, such as mishaps resulting from synthetic biology or artificial intelligence.

A number of authors have argued that existential risks are especially important because the long-run future of humanity matters a great deal (Beckstead 2013; Bostrom 2013; Greaves & MacAskill 2019; Ord 2020a). Many believe that there is no intrinsic moral difference between the value of a life today and the value of a life a hundred years from now. Because there may be vastly more people in the future than are alive today, these authors argue that it is overwhelmingly important to preserve humanity’s long-term potential, even if the risks to humanity are small.

One objection to this argument is that people have a special responsibility to other people currently alive that they do not have to people who have not yet been born (Roberts 2009). Another objection is that, although existential risks would in principle be important to manage, they are currently so unlikely and so poorly understood that work to reduce them is less cost-effective than work on other promising areas....
