Estimation of existential risk

I've removed the following from the human extinction entry:

One way to estimate the probability of extinction is to estimate the probabilities of individual extinction risks, such as the risk from nuclear war or artificial intelligence, and then to combine them into an overall figure.

Another approach is to use more abstract arguments, for instance ones that draw on the Fermi paradox.

A few prominent academics have offered their own personal estimates of the probability that humans will go extinct or face a civilizational collapse. The philosopher Nick Bostrom has placed the odds that humans will go extinct at greater than 25%, though he doesn't specify by what date. The astrophysicist Martin Rees has placed the odds of a civilizational collapse in this century at 50%. It's unclear, however, how much can be inferred from these subjective estimates.

I think this could be incorporated here, though it's a bit outdated and superseded by better estimates.
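
As a side note on the first approach quoted above: if the individual risks are treated as roughly independent (a strong simplifying assumption), combining them into an overall figure amounts to taking one minus the product of the per-risk survival probabilities. A minimal sketch in Python, with placeholder numbers rather than actual estimates:

# A minimal sketch: combine per-risk extinction probabilities into an overall
# figure, assuming (simplistically) that the risks are independent.
# The numbers are illustrative placeholders, not actual estimates.
individual_risks = {
    "nuclear war": 0.001,
    "artificial intelligence": 0.010,
    "engineered pandemic": 0.003,
}

# P(extinction) = 1 - product over risks of (1 - p_i)
p_survival = 1.0
for p in individual_risks.values():
    p_survival *= 1.0 - p

overall_risk = 1.0 - p_survival
print(f"Combined probability of extinction: {overall_risk:.4f}")

Correlations between the risks would of course change the combined figure, which is one limitation of this simple aggregation.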

Goth, Aidan & Stephen Clare (2020) Dr. Philip Tetlock’s forecasting research, Founders Pledge, November 27.
This report discusses plans for "work on methodological questions with an eye towards hosting a forecasting tournament focused on global catastrophic risks in summer 2021. [Tetlock and a collaborator] call this “second generation forecasting”: forecasting that predicts events over longer timescales and in the face of deep uncertainty."

https://founderspledge.com/stories/dr-philip-tetlocks-forecasting-research-high-impact-funding-opportunity

Tonn, Bruce & Dorian Stiefel (2013) Evaluating methods for estimating existential risks, Risk Analysis, vol. 33, pp. 1772–1787.

Bibliography

Beard, Simon, Thomas Rowe & James Fox (2020) An analysis and evaluation of methods currently used to quantify the likelihood of existential hazards, Futures, vol. 115, pp. 1–14.

Besiroglu, Tamay (2019) Ragnarök Series — results so far, Metaculus, October 15.

Muehlhauser, Luke (2019) How feasible is long-range forecasting?, Open Philanthropy, October 10.

Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing.

Sandberg, Anders & Nick Bostrom (2008) Global catastrophic risks survey, Technical Report #2008-1, Future of Humanity Institute, University of Oxford.
