In October 2018, I developed a question series on Metaculus related to extinction events, spanning risks from nuclear war, bio-risk, climate change and geo-engineering, Artificial Intelligence, and nanotechnology failure modes. Since then, these questions have accrued over 3,000 predictions (ETA: as of today, the number is around 5,000).
A catastrophe is defined as a reduction in the human population of at least 10% in any period of 5 years or less. (Near) extinction is defined as an event that reduces the human population by at least 10% within 5 years, and by at least 95% within 25 years.
Here's a summary of the results as they stand today (September 24, 2023), ordered by risk of near extinction:
| Global catastrophic risk | Chance of catastrophe by 2100 | Chance of (near) extinction by 2100 |
| --- | --- | --- |
| Artificial Intelligence | 6.16% | 3.39% |
| Other risks | 1.52% | 0.13% |
| Biotechnology or bioengineered pathogens | 1.52% | 0.07% |
| Nuclear war | 2.86% | 0.06% |
| Nanotechnology | 0.02% | 0.01% |
| Climate change or geo-engineering | 0.00% | 0.00% |
| Natural pandemics | 0.62% | N/A |
These predictions are generated by aggregating forecasters' individual predictions based on their track records. Specifically, each prediction is weighted by a function of the forecaster's level of 'skill', where 'skill' is estimated from their relative performance on a number (typically many hundreds) of resolved questions.
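To make the aggregation step concrete, here is a minimal sketch of skill-weighted averaging in Python. The weighting function, skill scores, and probabilities below are illustrative placeholders, not Metaculus's actual algorithm, which is more sophisticated than this.

```python
import numpy as np

def skill_weighted_aggregate(probs, skills):
    """Combine individual probability forecasts, giving more weight to
    forecasters with better track records.

    probs  : individual probability forecasts (0-1)
    skills : skill scores estimated from past resolved questions
             (higher = better track record)

    The exponential mapping from skill to weight is a placeholder,
    not the formula Metaculus actually uses.
    """
    probs = np.asarray(probs, dtype=float)
    weights = np.exp(np.asarray(skills, dtype=float))
    return float(np.sum(weights * probs) / np.sum(weights))

# Example: three forecasters with different track records
print(skill_weighted_aggregate(probs=[0.02, 0.05, 0.10],
                               skills=[1.5, 0.8, 0.2]))  # ~0.04
```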
If we assume that these events are independent, the predictions suggest that there's a ~17% chance of catastrophe, and a ~1.9% chance of (near) extinction by the end of the century. Admittedly, independence is likely to be an inappropriate assumption, since, for example, some catastrophes could exacerbate other global catastrophic risks.[1]
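For concreteness, here is a short Python sketch of that independence calculation. The probabilities in the example are placeholders rather than the figures above, which were derived from the underlying forecasts.

```python
def combined_risk(per_risk_probs):
    """Probability that at least one event occurs, assuming the risks are
    independent: 1 minus the product over risks of (1 - p_i)."""
    no_event = 1.0
    for p in per_risk_probs:
        no_event *= (1.0 - p)
    return 1.0 - no_event

# Illustrative only: two independent risks of 10% and 5%
print(combined_risk([0.10, 0.05]))  # 0.145, slightly less than the 0.15 sum
```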
Interestingly, the predictions indicate that although nuclear war and bioengineered pathogens are among the risks most likely to result in a major catastrophe, an AI failure mode is by far the biggest source of extinction-level risk: it is at least 5 times more likely to cause near extinction than all other risks combined.
Links to all the questions on which these predictions are based may be found here.
For reference, these were the estimates when I first posted this (19 Jun 2022):
| Global catastrophic risk | Chance of catastrophe by 2100 | Chance of (near) extinction by 2100 |
| --- | --- | --- |
| Artificial Intelligence | 3.06% | 1.56% |
| Other risks | 1.36% | 0.11% |
| Biotechnology or bioengineered pathogens | 2.21% | 0.07% |
| Nuclear war | 1.87% | 0.06% |
| Nanotechnology | 0.17% | 0.06% |
| Climate change or geo-engineering | 0.51% | 0.01% |
| Natural pandemics | 0.51% | N/A |
Thanks for sharing this summary! I think these questions and forecasts are a useful resource.
For anyone who wants to see more forecasts of existential risks (or similarly extreme outcomes), I made a database of all the ones I'm aware of. (People can also suggest additions to it, and it includes a link to these Metaculus forecasts.) And here's a short talk in which I introduce the database and give an overview of the importance and challenges of estimating existential risk.
You may very well already be aware of this (I didn't look at your linked post closely), but Elicit IDE has a "Search Metaforecast database" tool that searches forecasts across several sites, which may be helpful for your existential risk forecast database project. Here are the first 120 results for "existential risk."