Existential risk

An existential risk is a risk of an existential catastrophe, i.e. one that threatens the destruction of the long-term potential of life.[1] An existential risk could threaten the extinction of humans (and other sentient beings), or it could threaten some other unrecoverable collapse or permanent failure to achieve a potential good state. Natural risks such as those posed by asteroids or supervolcanoes could be existential risks, as could anthropogenic (human-caused) risks like accidents from synthetic biology or unaligned artificial intelligence.

Estimating the probability of existential risk from different factors is difficult, but some estimates exist.[1]

Some view reducing existential risks as a key moral priority, for a variety of reasons.[2] Some people simply view the current estimates of existential risk as unacceptably high. Other authors argue that existential risks are especially important because the long-run future of humanity matters a great deal.[1][3][4][5] Many believe that there is no intrinsic moral difference between the importance of a life today and one in a hundred years. However, there may be many more people in the future than there are now. Given these assumptions, existential risks threaten not only the beings alive right now, but also the enormous number of lives yet to be lived.

One objection to this argument is that people have a special responsibility to other people currently alive that they do not have to people who have not yet been born.[7] Another objection is that, although existential risks would in principle be important to manage, they are currently so unlikely and poorly understood that existential risk reduction is less cost-effective than work on other promising areas.

Recommendations

In The Precipice: Existential Risk and the Future of Humanity, Toby Ord offers several policy and research recommendations for handling existential risks:[8]


  • Explore options for new international institutions aimed at reducing existential risk, both incremental and revolutionary.
  • Investigate possibilities for making the deliberate or reckless imposition of human extinction risk an international crime.
  • Investigate possibilities for bringing the representation of future generations into national and international democratic institutions.
  • Each major world power should have an appointed senior government position responsible for registering and responding to existential risks that can be realistically foreseen in the next 20 years.
  • Find the major existential risk factors and security factors—both in terms of absolute size and in the cost-effectiveness of marginal changes.
  • Target efforts at reducing the likelihood of military conflicts between the US, Russia and China.
  • Improve horizon-scanning for unforeseen and emerging risks.
  • Investigate food substitutes in case of extreme and lasting reduction in the world’s ability to supply food.
  • Develop better theoretical and practical tools for assessing risks with extremely high stakes that are either unprecedented or thought to have extremely low probability.
  • Improve our understanding of the chance civilization will recover after a global collapse, what might prevent this, and how to improve the odds.
  • Develop our thinking about grand strategy for humanity.
  • Develop our understanding of the ethics of existential risk and valuing the longterm future.
  1. ^

    Bostrom, Nick (2012) Frequently asked questions, Existential Risk: Threats to Humanity’s Future (updated 2013).

  2. ^

    Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing.

  3. ^

    Beckstead, Nick (2013) On the Overwhelming Importance of Shaping the Far Future, PhD thesis, Rutgers University.

  4. ^

    Bostrom, Nick (2013) Existential risk prevention as global priority, Global Policy, vol. 4, pp. 15–31.

  5. ^

    Greaves, Hilary & William Macaskill (2019) The case for strong longtermism, GPI working paper No. 7-2019, Working paper Global Priorities Institute, Oxford University.

  6. ^

    Roberts, M. A. (2009) The nonidentity problem, Stanford Encyclopedia of Philosophy, July 21 (updated 1 December 2020).

  7. ^

    Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, pp. 280–281.

Further reading

Bostrom, Nick (2013) Existential risk prevention as global priority, Global Policy, vol. 4, pp. 15–31.

Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing.

Ord, Toby (2020) Existential risks to humanity, in Pedro Conceição (ed.) The 2020 Human Development Report: The Next Frontier: Human Development and the Anthropocene, New York: United Nations Development Programme, pp. 106–111.

Tomasik, Brian (2019) Risks of astronomical future suffering, Center on Long-Term Risk, July 2.
An article exploring ways in which a future full of Earth-originating life might be bad.

Whittlestone, Jess (2017) The long-term future, Effective Altruism, November 16.



  1. ^

    Bostrom, Nick (2012) Frequently asked questions, Existential Risk: Threats to Humanity’s Future (updated 2013).

  2. ^

    Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing.

  3. ^

    Todd, Benjamin (2017) The case for reducing existential risks, 80,000 Hours website. (Updated June 2022.)

  4. ^

    Beckstead, Nick (2013) On the Overwhelming Importance of Shaping the Far Future, PhD thesis, Rutgers University.

  5. ^

    Bostrom, Nick (2013) Existential risk prevention as global priority, Global Policy, vol. 4, pp. 15–31.

  6. ^

    Greaves, Hilary & William MacAskill (2019) The case for strong longtermism, GPI Working Paper No. 7-2019, Global Priorities Institute, University of Oxford.

  7. ^

    Roberts, M. A. (2009) The nonidentity problem, Stanford Encyclopedia of Philosophy, July 21 (updated 1 December 2020).

  8. ^

    Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, pp. 280–281.
