# Summary

- I estimated and studied the probability of extinction for various types of catastrophe in the 21st century in __this__ Sheet^{[1]}. The results for the probability of extinction are in the table below.
- The inputs are predictions from Metaculus’ __Ragnarok Series__^{[2]}, guesses provided by Luisa Rodriguez __here__, and guesses from me.

- Relative to Toby Ord’s best guesses for the existential risk from 2021 to 2120 given in The Precipice, my analysis suggests the relative importance of:
  - Artificial intelligence is similar (rounded to half an order of magnitude).
  - Climate change and geoengineering, synthetic biology, and “other” is half an order of magnitude lower.
  - Nuclear war is one order of magnitude higher.

Type of catastrophe (in the 21st century) | Probability of extinction (%)
---|---
Any* | 4.26
Any | 1.77
Artificial intelligence | 2.84
Climate change and geoengineering | 0.0106
Nanotechnology | 0.245
Nuclear war | 0.299
Synthetic biology | 0.220
Other | 0.695

## Acknowledgements

Thanks to David Denkenberger, Eli Lifland, Gregory Lewis, Misha Yagudin, Nuño Sempere, and Tamay Besiroglu.

# Methods

I calculated the probability of extinction for catastrophes in the 21st century caused by:

- Artificial intelligence.
- Climate change and geoengineering.
- Nanotechnology.
- Nuclear war.
- Synthetic biology.
- Other.
- Any.
- Any*.

The results for “any” do not explicitly depend on those of the first 6 types of catastrophe mentioned above, whereas those for “any*” are calculated from them assuming independence between them.
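
The “any*” aggregation can be sketched as follows, using the per-type probabilities from the results table further below (a minimal illustration of the independence assumption, not the Sheet’s actual formula):

```python
import math

# Probability of extinction in the 21st century by type of catastrophe,
# copied from the results table (as fractions, not %).
p_by_type = {
    "artificial intelligence": 0.0284,
    "climate change and geoengineering": 0.000106,
    "nanotechnology": 0.00245,
    "nuclear war": 0.00299,
    "synthetic biology": 0.00220,
    "other": 0.00695,
}

# "Any*" assumes independence between the 6 types: the probability that at
# least one of them leads to extinction.
p_any_star = 1 - math.prod(1 - p for p in p_by_type.values())
print(round(p_any_star, 4))  # ≈ 0.0426, i.e. the 4.26 % in the results table
```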

The inputs to the calculations are:

- Metaculus’ community predictions for the questions of the __Ragnarok Series__, which ask about the probability of a population loss of 10 % and 95 % for various types of catastrophe during the 21st century.
- Luisa Rodriguez’ guesses for __what is the likelihood that civilizational collapse would directly lead to human extinction (within decades)__, and one related guess of mine.
- My guesses for the probability of various scenarios of major infrastructure damage and climate change.

Concretely, I calculated the probability of extinction as the sum of the following 3 products (see tab “Probability of extinction by catastrophe” of __this__ Sheet):

- The probability of the population loss being between 0 and 10 %, times the sum over scenarios of the probability of extinction given a population loss between 0 and 10 % under each scenario times the probability of that scenario.
- The probability of the population loss being between 10 % and 95 %, times the analogous sum over scenarios for that range.
- The probability of the population loss being between 95 % and 1, times the analogous sum over scenarios for that range.
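
As a sketch, the nuclear war row of the results tables can be reproduced from these 3 products, using the (rounded) values given in the “Results” section below; the Sheet computes them from unrounded inputs:

```python
# P(population loss in range) for nuclear war, from the results tables.
p_loss = {"0-10%": 0.904, "10-95%": 0.0922, "95-100%": 0.00384}
# P(scenario | population loss) for nuclear war, from the scenario table.
p_scenario = {"none": 0.0, "both": 1 / 3, "either": 2 / 3}
# P(extinction | loss range, scenario), from the scenario results table.
p_ext = {
    "0-10%": {"none": 0.0, "both": 0.00176, "either": 0.0},
    "10-95%": {"none": 0.000413, "both": 0.0205, "either": 0.00291},
    "95-100%": {"none": 0.251, "both": 0.551, "either": 0.372},
}

# Sum over loss ranges of P(range) times the scenario-weighted P(extinction).
p_extinction = sum(
    p_loss[r] * sum(p_ext[r][s] * p_scenario[s] for s in p_scenario)
    for r in p_loss
)
print(round(p_extinction, 5))  # ≈ 0.003, i.e. the 0.299 % in the results table
```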

## Population loss

I computed the probability of the population loss falling into each of the 3 population loss ranges presented above based on the complementary cumulative distribution function (__CCDF__) of the population loss (see tab “Probability of the population loss”). I assumed the CCDF decreases linearly between each consecutive pair of the following points^{[3]} (see tab “CCDF of the population loss”):

- Population loss of 0, and CCDF of 1, i.e. probability 1 of the catastrophe decreasing the population size.
- Population loss of 10 %, and CCDF given by the product between:
  - The probability of a population loss greater than 10 %.
  - The probability of such population loss being caused by a certain type of catastrophe, given that it occurred.
- Population loss of 95 %, and CCDF given by the product between:
  - The probability of a population loss of 10 % caused by a certain type of catastrophe, which equals the product just above.
  - The probability of a population loss of 95 % being caused by a certain type of catastrophe, given that a population loss of 10 % caused by that type of catastrophe occurred.
- Population loss of 1, and CCDF of 0, i.e. probability 0 of the population after the catastrophe being negative.

I set the probabilities required to determine the CCDF for the population losses of 10 % and 95 % to Metaculus’ community predictions (collected in tab “Metaculus' predictions”).
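
Given the 4 CCDF anchor points, the probability of each loss range is a simple difference of consecutive CCDF values. A minimal sketch, using the CCDF values implied by the nuclear war row of the results tables (the Sheet derives the two middle points from Metaculus’ predictions as described above):

```python
# CCDF of the population loss at the four anchor points (values implied by
# the nuclear war row of the results tables).
ccdf = {0.0: 1.0, 0.10: 0.096, 0.95: 0.00384, 1.0: 0.0}

losses = sorted(ccdf)
# P(loss in [a, b]) = CCDF(a) - CCDF(b).
p_range = {(a, b): ccdf[a] - ccdf[b] for a, b in zip(losses, losses[1:])}
# p_range ≈ {(0, 0.1): 0.904, (0.1, 0.95): 0.0922, (0.95, 1): 0.00384}
```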

## Probability of extinction

I calculated the probability of extinction for each of the 3 population loss ranges presented above for 3 exhaustive scenarios (see tab “Probability of extinction by scenario”):

- Without major infrastructure damage nor (major) climate change.
- With major infrastructure damage and (major) climate change.
- With either major infrastructure damage or (major) climate change.

To illustrate what is intended by “major infrastructure damage” and “major climate change”, Luisa __writes__:

- “Major infrastructure damage”: “e.g. damaged roads, destroyed bridges, collapsed buildings, damaged power lines”.
- “Major climate change”: “e.g. nuclear winter”.

For the 1st and 2nd scenarios, I determined the probability of extinction from its mean value over each of the population loss ranges. For the 3rd one, I computed it from the geometric mean between the values for the 1st and 2nd scenarios.

I assumed the probability of extinction to increase linearly with the population loss between each consecutive pair of the following points:

- Without major infrastructure damage nor climate change:
  - Population loss of 0, and probability of extinction of 0.
  - Population loss of 50 %, and probability of extinction of PE_1 = (0*0.0001)^0.5 = 0.
  - Population loss of 99.99 %, and probability of extinction of 0.173 %, which I estimated from 1 % of the probability of extinction for the same population loss, but with major infrastructure damage and climate change.
  - Population loss of PL = 1 - 10^-5.5 = 99.9997 %, and probability of extinction of 50 %.
  - Population loss of 1, and probability of extinction of 1.

- With major infrastructure damage and climate change:
  - Population loss of 0, and probability of extinction of 0.
  - Population loss of 90 %, and probability of extinction of PE_2 = 10^-1.5 = 3.16 %.
  - Population loss of 99.99 %, and probability of extinction of PE_3 = (0.1*0.3)^0.5 = 17.3 %.
  - Population loss of 1, and probability of extinction of 1.
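
A sketch of this interpolation for the scenario with major infrastructure damage and climate change, together with the mean probability of extinction over the 10 % to 95 % loss range assuming the population loss is uniform within the range. This reproduces the 2.05 % entry of the scenario results table below, but it is an illustration rather than the Sheet’s exact formula:

```python
import numpy as np

# Piecewise-linear P(extinction | population loss) for the scenario with
# major infrastructure damage and climate change, using the points above.
loss_points = np.array([0.0, 0.90, 0.9999, 1.0])
p_ext_points = np.array([0.0, 10 ** -1.5, (0.1 * 0.3) ** 0.5, 1.0])

def p_ext(loss):
    return np.interp(loss, loss_points, p_ext_points)

# Mean probability of extinction over the 10 % to 95 % loss range, assuming
# a uniform loss distribution within the range.
grid = np.linspace(0.10, 0.95, 100_001)
mean_10_95 = p_ext(grid).mean()
print(round(float(mean_10_95), 4))  # ≈ 0.0205
```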

PE_1, PL, PE_2 and PE_3 are the geometric means between the lower and upper bounds of the best guesses provided by Luisa __here__:

- For the population loss of 50 % without major infrastructure damage nor climate change (PE_1):
  - “Case 1: I [Luisa] think it’s exceedingly unlikely [probability “< 0.0001”, i.e. between 0 and 0.0001; see 1st table] that humanity would go extinct (within ~a generation) as a direct result of a catastrophe that causes the deaths of 50% of the world’s population, but causes no major infrastructure damage (e.g. damaged roads, destroyed bridges, collapsed buildings, damaged power lines, etc.) or extreme changes in the climate (e.g. cooling)”.

- For the population loss at which the probability of extinction reaches 50 % without major infrastructure damage nor climate change (PL):
  - “My [Luisa’s] best guess is that the turning point at which extinction goes from under 50% to over 50% is between 99.999% population death (80,000) and 99.9999% (8,000) population death (even before considering additional starting conditions like infrastructure damage or climate change)”.

- For the population loss of 90 % with major infrastructure damage and climate change (PE_2):
  - “Case 2: I [Luisa] think it’s very unlikely [probability “between 0.01 and 0.1”] that humanity would go extinct as a direct result of a catastrophe that caused the deaths of 90% of the world’s population (leaving 800 million survivors), major infrastructure damage, and severe climate change (e.g. nuclear winter/asteroid impact)”.

- For the population loss of 99.99 % with major infrastructure damage and climate change (PE_3):
  - “Case 3: I [Luisa] think it’s fairly unlikely [probability “between 0.1 and 0.3”] that humanity would go extinct as a direct result of a catastrophe that caused the deaths of 99.99% of people (leaving 800 thousand survivors), extensive infrastructure damage, and temporary climate change (e.g. a more severe nuclear winter/asteroid impact, plus the use of biological weapons)”.
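
The four quantities follow directly from the bounds quoted above; for PL, the geometric mean is taken over the surviving fractions (10^-5 and 10^-6) and then converted back to a population loss:

```python
from math import sqrt

def geomean(lo, hi):
    """Geometric mean of the lower and upper bounds of a guess."""
    return sqrt(lo * hi)

PE_1 = geomean(0, 0.0001)      # 0: 50 % loss, no infrastructure damage nor climate change
PE_2 = geomean(0.01, 0.1)      # ≈ 0.0316: 90 % loss, with both
PE_3 = geomean(0.1, 0.3)       # ≈ 0.173: 99.99 % loss, with both
PL = 1 - geomean(1e-5, 1e-6)   # ≈ 0.999997, i.e. 1 - 10^-5.5
```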

## Probability of the scenarios

My guesses for the probability of each of the 3 scenarios defined in the previous section, given a population loss caused by a certain type of catastrophe, are in the table below (and in tab “Probability of extinction scenarios by catastrophe”). I calculated the probability for the type “other” from the mean of the probabilities for the other types of catastrophe (excluding “any”), and the one for the type “any” from the mean of the probabilities of the various types weighted by their probability of leading to a population loss between 95 % and 1.

Type of catastrophe (in the 21st century) | No major infrastructure damage nor climate change | Major infrastructure damage and climate change | Either major infrastructure damage or climate change
---|---|---|---
Any | 29.2 % | 22.5 % | 48.3 %
Artificial intelligence | 1/4 | 1/4 | 1/2
Climate change and geoengineering | 0 | 0 | 1
Nanotechnology | 0 | 1/3 | 2/3
Nuclear war | 0 | 1/3 | 2/3
Synthetic biology | 1 | 0 | 0
Other | 25.0 % | 18.3 % | 56.7 %
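
The “other” and “any” rows of the table above can be reproduced as follows (a sketch; the weights are the probabilities of a population loss between 95 % and 1 from the results tables below):

```python
# Scenario probabilities (none, both, either) per type, from the table above.
types = {
    "artificial intelligence": (1 / 4, 1 / 4, 1 / 2),
    "climate change and geoengineering": (0, 0, 1),
    "nanotechnology": (0, 1 / 3, 2 / 3),
    "nuclear war": (0, 1 / 3, 2 / 3),
    "synthetic biology": (1, 0, 0),
}

# "Other": unweighted mean over the types above.
other = tuple(sum(t[i] for t in types.values()) / len(types) for i in range(3))
# other ≈ (0.250, 0.183, 0.567)

# "Any": mean weighted by the probability (%) of a population loss between
# 95 % and 1, from the results section.
weights = {
    "artificial intelligence": 7.20,
    "climate change and geoengineering": 0.0160,
    "nanotechnology": 0.422,
    "nuclear war": 0.384,
    "synthetic biology": 0.864,
    "other": 1.69,
}
rows = {**types, "other": other}
any_row = tuple(
    sum(weights[k] * rows[k][i] for k in weights) / sum(weights.values())
    for i in range(3)
)
# any_row ≈ (0.292, 0.225, 0.483)
```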

# Results

The tables below contain the results for:

- The probability of the population loss ranges by type of catastrophe.
- The probability of extinction by population loss range and scenario.
- The probability of extinction by population loss range and type of catastrophe.

Type of catastrophe (in the 21st century) | Probability (%) of a population loss of 0 to 10 % | Probability (%) of a population loss of 10 % to 95 % | Probability (%) of a population loss of 95 % to 1
---|---|---|---
Any* | 100 | 25.3 | 10.3
Any | 68.0 | 27.8 | 4.16
Artificial intelligence | 90.4 | 2.40 | 7.20
Climate change and geoengineering | 98.4 | 1.58 | 0.0160
Nanotechnology | 99.0 | 0.538 | 0.422
Nuclear war | 90.4 | 9.22 | 0.384
Synthetic biology | 90.4 | 8.74 | 0.864
Other | 92.6 | 5.67 | 1.69

Scenario | Probability of extinction (%) for a population loss of 0 to 10 % | Probability of extinction (%) for a population loss of 10 % to 95 % | Probability of extinction (%) for a population loss of 95 % to 1
---|---|---|---
No major infrastructure damage nor climate change | 0 | 0.0413 | 25.1
Major infrastructure damage and climate change | 0.176 | 2.05 | 55.1
Either major infrastructure damage or climate change | 0 | 0.291 | 37.2
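
The third row above is, per loss range, the geometric mean of the first two rows, as described in the Methods section:

```python
from math import sqrt

# P(extinction | loss range) for the first two scenarios (as fractions).
no_damage = [0.0, 0.000413, 0.251]
both = [0.00176, 0.0205, 0.551]

# Geometric mean per loss range gives the "either" scenario.
either = [sqrt(a * b) for a, b in zip(no_damage, both)]
# either ≈ [0.0, 0.00291, 0.372]
```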

Type of catastrophe (in the 21st century) | Probability of extinction (%) for a population loss of 0 to 10 % | Probability of extinction (%) for a population loss of 10 % to 95 % | Probability of extinction (%) for a population loss of 95 % to 1 | Probability of extinction (%) for a population loss of 0 to 1 (total)
---|---|---|---|---
Any* | 0.180 | 0.141 | 3.95 | 4.26
Any | 0.0269 | 0.171 | 1.57 | 1.77
Artificial intelligence | 0.0397 | 0.0160 | 2.78 | 2.84
Climate change and geoengineering | 0 | 0.00461 | 0.00595 | 0.0106
Nanotechnology | 0.0580 | 0.00471 | 0.182 | 0.245
Nuclear war | 0.0529 | 0.0808 | 0.166 | 0.299
Synthetic biology | 0 | 0.00361 | 0.217 | 0.220
Other | 0.0298 | 0.0312 | 0.634 | 0.695

# Discussion

## Probability of extinction by scenario

The relative importance of major infrastructure damage and climate change decreases as the severity of the population loss increases. The ratio between the probability of extinction without major infrastructure damage nor climate change and the probability of extinction with both is (see cells F3:F5 of tab “Probability of extinction by scenario”):

- 0 for a population loss between 0 and 10 %.
- 2.02 % for a population loss between 10 % and 95 %.
- 45.5 % for a population loss between 95 % and 1.

This tendency seems correct, as the probability of extinction is 1 for a population loss of 1 regardless of infrastructure damage and climate change.

## Probability of extinction by type of catastrophe

### Comparison of absolute values with the GCRS

In the table below (and in tab “Comparison of absolute values with the GCRS”), I compare the probability of extinction by type of catastrophe in the 21st century I estimated with ones I derived from the 2008 Global Catastrophic Risks Survey (GCRS), whose results are presented in __this__ report by Anders Sandberg and Toby Ord from the __Future of Humanity Institute__^{[4]} (see tab “2008 Global Catastrophic Risks Survey”). The GCRS estimates refer to the period from 2009 to 2099, but I adjusted them to the period from 2023 to 2100 assuming constant risk. Additionally, I derived GCRS’ estimate for “other” risks assuming independence between the types of catastrophes^{[5]}.
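
One way to reproduce the constant-risk adjustment is to treat 2009 to 2099 as 91 years and 2023 to 2100 as 78 years, and rescale via the implied annual risk. A sketch, using a 5 % input (the GCRS extinction estimate for artificial intelligence implied by the 4.30 % in the table below):

```python
def adjust_constant_risk(p, years_from=91, years_to=78):
    """Rescale a probability over `years_from` years to one over `years_to`
    years, assuming a constant annual risk."""
    annual = 1 - (1 - p) ** (1 / years_from)
    return 1 - (1 - annual) ** years_to

print(round(adjust_constant_risk(0.05), 4))  # ≈ 0.043
print(round(adjust_constant_risk(0.01), 5))  # ≈ 0.00858
```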

Type of catastrophe (in the 21st century) | My analysis (%) | GCRS (%) | Absolute difference to GCRS (pp) | Relative difference to GCRS (%)
---|---|---|---|---
Any* | 4.26 | 16.5 | -12.3 | -74.2
Any | 1.77 | 16.5 | -14.8 | -89.3
Artificial intelligence | 2.84 | 4.30 | -1.46 | -34.0
Nanotechnology | 0.245 | 4.30 | -4.06 | -94.3
Nuclear war | 0.299 | 0.858 | -0.558 | -65.1
Synthetic biology | 0.220 | 1.72 | -1.50 | -87.2
Other | 0.695 | 6.46 | -5.76 | -89.2

My probabilities of extinction are lower than those I derived from the GCRS for all types of catastrophe. Nanotechnology has the largest relative difference, and artificial intelligence the smallest.

The GCRS did not address “climate change and geoengineering”, but my estimate of 0.0106 % is similar to:

- 10 % of the best guess of 0.1 % mentioned by Toby Ord in __The Precipice__ for the __existential risk__ due to climate change from 2021 to 2120 (see Table 6.1).
- 10 % of the upper bound of 0.1 %, and 10 times the best guess of 0.001 %, mentioned __here__ by John Halstead for the existential risk due to climate change^{[6]}.
- The upper bound of 0.01 % guessed by 80,000 Hours __here__ for the existential risk due to climate change^{[7]}.

### Comparison of priorities with The Precipice

Ultimately, what is most relevant for prioritisation is how the various probabilities compare with each other. With this in mind, in the table below (and in tab “Comparison of priorities with The Precipice”), I present the probability of extinction in the 21st century as a fraction of that for “any*”, and the existential risk between 2021 and 2120 guessed by Toby Ord in __The Precipice__ (see tab “Existential risk estimates from The Precipice”) as a fraction of the total. The existential risk for “other” was estimated from those for “unforeseen anthropogenic risk” and “other anthropogenic risk” assuming independence between them.

Type of catastrophe | Normalised probability of extinction for a catastrophe in the 21st century (%) | Normalised existential risk from 2021 to 2120 (%) | Ratio | Decimal logarithm of the ratio
---|---|---|---|---
Artificial intelligence | 66.6 | 60.0 | 1.11 | 0.0455
Climate change and geoengineering | 0.248 | 0.600 | 0.413 | -0.384
Nuclear war | 7.03 | 0.600 | 11.7 | 1.07
Synthetic biology | 5.17 | 20.0 | 0.259 | -0.587
Other | 16.3 | 31.6 | 0.516 | -0.287
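
The conclusions that follow come from rounding the decimal logarithms of the ratios in the table above to the nearest half order of magnitude:

```python
from math import log10

# Ratios of normalised probability of extinction to normalised existential
# risk, from the table above.
ratios = {
    "artificial intelligence": 1.11,
    "climate change and geoengineering": 0.413,
    "nuclear war": 11.7,
    "synthetic biology": 0.259,
    "other": 0.516,
}
# Decimal logarithm, rounded to the nearest half order of magnitude.
half_oom = {k: round(2 * log10(r)) / 2 for k, r in ratios.items()}
# → AI: 0.0 (similar); climate change, synthetic biology, other: -0.5
#   (half an order of magnitude lower); nuclear war: 1.0 (one higher).
```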

Relative to Toby Ord’s best guesses, my analysis suggests the relative importance of:

- Artificial intelligence is similar (rounded to half an order of magnitude).
- Climate change and geoengineering, synthetic biology, and “other” is half an order of magnitude lower.
- Nuclear war is one order of magnitude higher.

The adequacy of this comparison depends on the extent to which probability of extinction is a good proxy for existential risk.

## Quality of the inputs

In essence, the results I obtained are a function of guesses from Metaculus’ forecasters, Luisa Rodriguez, and me. I should note there is margin to improve the quality of the inputs:

- Regarding Metaculus:
  - Eli Lifland, Gregory Lewis, Misha Yagudin, and Nuño Sempere from __Samotsvety Forecasting__ (and presumably other superforecasters) expressed concerns about relying on Metaculus’ community predictions^{[8]}.
  - I also noticed these are internally inconsistent:
    - The probability of a population loss greater than 95 % due to “any” catastrophe is lower than that due to artificial intelligence (4.16 % < 7.20 %; see cells B5:C5 of tab “CCDF of the population loss”).
    - This leads to the probability of extinction due to “any” being lower than that due to artificial intelligence (1.77 % < 2.84 %; see cells D6:E6 of tab “Probability of extinction by catastrophe”).
  - However, I do not know about other forecasts looking into population losses by catastrophe such as __Metaculus' Ragnarok series__.
- Luisa’s __analysis__ is great, but it is “a first step toward understanding this threat from civilizational collapse — not a final or decisive one”.
- I am not a forecaster, and merely based my guesses on my previous knowledge.

That being said, for the reasons outlined by Scott Alexander __here__, I believe establishing priorities based on a quantitative model with guessed inputs is often better than guessing priorities.

^{[1]} To clarify, the probability refers to catastrophes occurring during the 21st century, but the extinction may happen afterwards.

^{[2]} The results in the Sheet are updated automatically as Metaculus’ predictions change.

^{[3]} This implies the probability density function (__PDF__) of the population loss is uniform for each of the 3 ranges.

^{[5]} This implies the GCRS’ estimates for “any*” are the same as for “any”.

^{[6]} “With those caveats in my mind, my best guess estimate is that the indirect risk of existential catastrophe due to climate change is on the order of 1 in 100,000, and I struggle to get the risk above 1 in 1,000. Working directly on US-China, US-Russia, India-China, or India-Pakistan relations seems like a better way to reduce the risk of Great Power War than working on climate change”. I guess John’s best guess for the total risk of existential catastrophe due to climate change is similar to his best guess for the indirect risk, which equals his upper bound for the direct risk: “I [John] construct several models of the direct extinction risk from climate change but struggle to get the risk above 1 in 100,000 over all time”.

^{[7]} “That said, we [80,000 Hours] still think this risk is relatively low. If climate change poses something like a 1 in 1,000,000 risk of extinction by itself, our guess is that its contribution to other existential risks is at most a few orders of magnitude higher — so something like 1 in 10,000”.