
Human extinction is the death of all members of the species Homo sapiens.

Premature human extinction is one of several scenarios for humanity’s long-term future. If we accept that extinction is very important to avoid, it becomes important to judge how likely it is. Conversely, if human extinction appeared inevitable, that would be a reason to focus more on short-term impacts when trying to do good.

One way to estimate the probability of extinction is to estimate the probabilities of individual extinction risks, such as the risk from nuclear war or artificial intelligence, and then to combine them into an overall figure.
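
As a rough illustration of this aggregation step, the sketch below combines a few hypothetical per-risk probabilities under the strong simplifying assumption that the risks are independent; the specific figures are placeholders, not estimates drawn from the literature.

```python
# Sketch of combining per-risk extinction probabilities into one figure.
risks = {
    "nuclear war": 0.01,              # hypothetical placeholder values,
    "artificial intelligence": 0.05,  # not estimates from the literature
    "engineered pandemic": 0.02,
}

# Assuming the risks are independent (a strong simplification),
# survival probabilities multiply, so:
#   P(extinction) = 1 - prod_i (1 - p_i)
p_survival = 1.0
for p in risks.values():
    p_survival *= 1 - p

p_extinction = 1 - p_survival
print(f"Combined extinction probability: {p_extinction:.3f}")  # ~0.078
```

In practice the risks interact (a nuclear war could raise or lower other risks), so a simple product like this is at best a first approximation.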

Another approach is to use more abstract arguments, for instance ones that draw on the Fermi paradox: the apparent absence of extraterrestrial civilizations may point to a "Great Filter" that prevents civilizations from emerging or surviving, and if that filter lies ahead of us rather than behind us, it would raise the estimated probability of extinction.

A few prominent academics have offered their own personal estimates of the probability that humans will go extinct or face a civilizational collapse. The philosopher Nick Bostrom has placed the odds that humans will go extinct at greater than 25%, though he doesn't specify by what date. The astrophysicist Martin Rees has placed the odds of a civilizational collapse in this century at 50%. It's unclear, however, how much can be inferred from these subjective estimates.
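
One way to make a per-century figure like Rees's more concrete is to convert it into an implied annual probability. The sketch below does so under a constant-hazard assumption, a simplification that the original estimate does not itself make.

```python
# Converting a per-century probability into an implied annual probability,
# assuming a constant hazard rate across the century (a simplification).
p_century = 0.5  # Rees's stated odds of civilizational collapse this century
years = 100

p_annual = 1 - (1 - p_century) ** (1 / years)
print(f"Implied annual probability: {p_annual:.2%}")  # ~0.69%
```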

...
