Existential risk

Below I consider changes for this Wiki page. 

The sentence

"Existential risks include natural risks such as those posed by asteroids or supervolcanoes as well as anthropogenic risks like mishaps resulting from synthetic biology or artificial intelligence." 

does not, in my view, adequately capture the existential risks humanity faces. I believe that including the list of existential risks covered in Bruce E. Tonn's Anticipation, Sustainability, Futures and Human Extinction on the EAF Existential Risk Wiki would be substantially more helpful to EAF readers than the sentence above. 

Some or all of Tonn's explanations can be replaced or supplemented with updated and/or more comprehensive information. Even if those on this forum versed in existential risk choose to do away with most of Tonn's descriptions, I believe that whatever remains of the risk framework below would still be a useful addition to this Wiki page. 

Here is the list of existential risks in Tonn's book, without their explanations: 

I Anthropogenic: current, preventable

  • Nuclear war
  • Climate change
  • Disease

II Coupled human–environment systems: current

  • Significant loss of biodiversity
  • Agricultural systems failure
  • Significant reduction in natural resources
  • Exceed key planetary boundaries

III Human reproduction: emerging, preventable

  • Infertility due to chemicals
  • Unintended consequences of medical advances
  • Dysgenics
  • Voluntary extinction

IV Risks to humanness: emerging, preventable

  • Evolution to posthumanism
  • Humans uploaded

V Advanced technology: emerging, preventable

  • Non-friendly Super-AIs
  • Technological Singularity

VI Natural terrestrial risks: anytime, unpreventable

  • Super volcanoes
  • Extreme ice age
  • Anoxic events

VII Solar system: anytime, unpreventable

  • Collisions with near-earth objects
  • Energy output from the sun
  • Carrington class ejection from the sun
  • Gamma ray burst
  • Near earth super or hypernova
  • Rogue black hole

VIII Extraterrestrial civilizations: anytime, unpreventable

  • Alien invasion
  • Destruction by aliens from afar
  • Other interventions by Godlike Creators
  • Other unknowns

IX Universe scale: very long term, unpreventable

  • Vacuum phase transition
  • Collision with Andromeda Galaxy
  • Expansion of the universe due to heat death
  • Collapse of the universe due to gravitational attraction

Of course, to understand some of these risk classifications adequately the context provided by Tonn in the book is needed. One of Tonn's explanations for the first risk category captures this idea:

This list leaves out several major risks to humanity that by themselves do not threaten humans with extinction. Bioterrorism involving the release of deadly microorganisms is one such threat. Unfortunately, this risk is increasing because of the increasing effectiveness of relatively low-cost, Do-It-Yourself (DIY) kits and instructions available over the internet. More generally, the risk is increasing that weapons of mass destruction will be developed and deployed by non-state actors. I am not arguing that these potential global catastrophic risks be ignored by any means. From the perspective of human extinction, though, they could play an important role in a series of events that could lead to human extinction (See the Singular Chain of Events Scenario at the end of Chapter 4).

So, many of these existential risks might be better classified as extreme risks or GCRs, or as events that greatly increase the chance of something else causing extinction shortly afterward (on a geological time scale), i.e., a chain of events. Should Tonn's list be incorporated into this Wiki page, I think providing explanations next to each risk, and perhaps next to each risk category as well, would be a good approach. If given permission by the community, I would begin by inserting the framework as you see it now and would then (1) link each risk to its Wikipedia page or flagship paper; (2) provide an explanation for each risk and risk category, sometimes drawing on the same sources as Tonn; and (3) optimize for brevity in doing (1) and (2). 

Beyond covering the actual existential risks listed on this Wiki page, I think copying some parts of the LessWrong Wiki concept page for Existential Risk (see https://www.lesswrong.com/tag/existential-risk) would be a good idea. The highest-priority addition I can think of would be Bostrom's 2002 classifications of existential risks, which would, in my opinion, coincide well with Tonn's risk framework. 

Bostrom proposes a series of classifications for existential risks:

  • Bangs - Earthly intelligent life is extinguished relatively suddenly by any cause; the prototypical end of humanity. Examples of bangs include deliberate or accidental misuse of nanotechnology, nuclear holocaust, the end of our simulation, or an unfriendly AI.
  • Crunches - The potential humanity had to enhance itself indefinitely is forever eliminated, although humanity continues. Possible crunches include an exhaustion of resources, social or governmental pressure ending technological development, and even future technological development proving an unsurpassable challenge before the creation of a superintelligence.
  • Shrieks - Humanity enhances itself, but explores only a narrow portion of its desirable possibilities. As the criteria for desirability haven't been defined yet, this category is mainly undefined. However, a flawed friendly AI incorrectly interpreting our values, a superhuman upload deciding its own values and imposing them on the rest of humanity, and an intolerant government outlawing social progress would certainly qualify.
  • Whimpers - Though humanity is enduring, only a fraction of our potential is ever achieved. Spread across the galaxy and expanding at near light-speed, we might find ourselves doomed by our own or another being's catastrophic physics experimentation, destroying reality at light-speed. A prolonged galactic war leading to our extinction or severe limitation would also be a whimper. More darkly, humanity might develop until its values were disjoint with ours today, making their civilization worthless by present values.

I do not have much more to say for now regarding this Wiki page. 

Please share your thoughts on these proposed edits. If people support them, I will make them. If people support them conditional on some further changes, I will update the edits accordingly and then make them. 

Thank you for reading this!

Also, pinging @Pablo given the extent of his contributions to the EAF Wiki pages.