Estimation of existential risk

I've removed the following from the Human extinction entry:

One way to estimate the probability of extinction is to estimate the probabilities of individual extinction risks, such as the risk from nuclear war or artificial intelligence, and then to combine them into an overall figure.
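
For illustration, here's a minimal sketch of that aggregation approach, assuming (unrealistically) that the individual risks are independent, and using purely hypothetical numbers rather than anyone's actual estimates:

```python
# Combine per-risk probabilities into an overall probability that
# at least one risk materializes, assuming independence:
#   P(extinction) = 1 - prod_i (1 - p_i)

risks = {
    "nuclear war": 0.01,          # hypothetical, illustrative numbers
    "engineered pandemic": 0.02,
    "unaligned AI": 0.05,
}

p_survive_all = 1.0
for p in risks.values():
    p_survive_all *= 1.0 - p  # survive each risk independently

p_extinction = 1.0 - p_survive_all
print(f"Combined extinction probability: {p_extinction:.2%}")  # ~7.83%
```

The independence assumption is doing real work here: if the risks are correlated (e.g. a nuclear war that raises the risk of subsequent catastrophes), a simple product like this would misstate the overall figure.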

Another approach is to use more abstract arguments, for instance ones that draw from the Fermi paradox.

A few prominent academics have offered their own personal estimates of the probability that humans will go extinct or face a civilizational collapse. The philosopher Nick Bostrom has placed the odds that humans will go extinct at greater than 25%, though he doesn't specify by what date. The astrophysicist Martin Rees has placed the odds of a civilizational collapse in this century at 50%. It's unclear, however, how much can be inferred from these subjective estimates.
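
As an aside on interpreting such figures, a per-century probability like Rees's 50% can be converted into an implied constant annual hazard rate. The constant-hazard assumption is my simplification for illustration, not something Rees himself claims:

```python
# Solve (1 - p_annual) ** 100 = 1 - p_century for p_annual.
p_century = 0.5
years = 100
p_annual = 1 - (1 - p_century) ** (1 / years)
print(f"Implied constant annual risk: {p_annual:.3%}")  # ~0.691%
```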

I think this could be incorporated here, though it's a bit outdated and superseded by better estimates.

It's possible that this entry is redundant, since we already have entries on Existential risk and on Forecasting: someone could just filter for both of those tags at once and get something similar to filtering for this tag (see the toy sketch below for what I mean by two-tag filtering). But a few points in favor of a separate entry:

  • People might not think to filter for two tags at once
  • People might also use a single tag/entry as a collection of posts on a topic, e.g. for sending to interested people, and a combo of two tags doesn't seem to work properly for that purpose
  • That's all just about the tagging functionality, not the wiki functionality. This seems to me like an important and large enough topic to warrant its own entry.
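
To make the two-tag point concrete, here is the toy sketch mentioned above of what filtering for both tags at once amounts to. The post data and field names are hypothetical; the forum's actual data model will differ:

```python
# Hypothetical posts, each carrying a set of tags.
posts = [
    {"title": "Post A", "tags": {"Existential risk", "Forecasting"}},
    {"title": "Post B", "tags": {"Forecasting"}},
    {"title": "Post C", "tags": {"Existential risk"}},
]

def filter_by_tags(posts, required):
    """Keep only posts that carry every tag in `required`."""
    return [p for p in posts if required <= p["tags"]]

matches = filter_by_tags(posts, {"Existential risk", "Forecasting"})
print([p["title"] for p in matches])  # ['Post A']
```

The intersection only approximates a dedicated tag: it misses relevant posts that happen to lack one of the two tags, which is part of why a separate entry seems useful.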

The fact that we have a specific entry for "AI forecasting", rather than just relying on the intersection of "AI alignment" (or whatever) and "Forecasting", seems in line with having a specific entry for this topic as well.

Some alternative name options:

  • Existential risk estimates
  • Estimation of existential risks
  • (Various permutations of these sorts of phrases)

I would prefer 'existential risk estimates' over 'estimating existential risks'.

EDIT: I realize I also prefer 'estimation of existential risks' over the two above.

Intuitively, it seems that Wikipedia and other reference works tend to prefer nominalized verbs over gerundive nominalizations (see here for discussion and examples of this distinction). So I would be inclined to adopt this as our general policy, though this is based more on my subjective sense of how reference works name articles than on any explicit statement, which I wasn't able to find after a few minutes of research (if anyone would like to look into this further, I'd be happy to defer to their findings).

Ok, I have no strong view, so I'll change it to "estimation of existential risks".

I think that, compared to "existential risk estimates", the new name captures a bit less intuitively those posts that don't discuss the process, pros, cons, etc. of existential risk estimation but rather just give some estimates. But "existential risk estimates" would have the opposite problem. There's probably no perfectly ideal name if we want the tag to capture both types of posts (which I currently do), but all of these names are probably "good enough" anyway.

Ironically, I think the one option we can now rule out as dominated by the others is my original choice of "Estimating existential risks".

I don't have time to write the text for this entry at the moment. Maybe I could in a few weeks, but I'm not sure, and other editors should definitely feel free to go on without me!

But I think the text could draw on some of the tagged posts and the material in Further reading. In particular, if I were writing this, I'd probably:

I'd also make sure to note explicitly that this entry is not necessarily just about extinction, and conversely that many of the tagged posts will also (or only) discuss estimates of outcomes less extreme than existential catastrophes, e.g. global catastrophic risks (GCRs).