
Two neglected X-risks

I've been going through various lists of x-risks (and GCRs, to account for some uncertainty re: climate and nuclear) published by prominent organisations, and after a brief scan have found that the usual list includes:

  1. Nuclear War
  2. Biotechnology/Man-made Pandemics
  3. Artificial Intelligence
  4. Climate Change


No list that I came across (explicitly) includes either:

  1. SETI/METI
  2. Molecular Nanotechnology/Atomically Precise Manufacturing (a.k.a. 'Grey Goo')


I would make the case that both of these are potential x-risks and should be taken seriously as objects of research. There appear to be credible reasons to believe that APM is not currently an urgent risk (see https://forum.effectivealtruism.org/posts/gjEbymta6w8yqNQnE/risks-from-atomically-precise-manufacturing by Michael Aird), but the same cannot be said of SETI/METI. After reading through some longtermist/EA and non-EA research into SETI/METI risks, it seems to be an open question just how probable the risk of finding, or being found by, a hostile ETI might be.

In the face of deep uncertainty around several key questions (one being the small matter of whether ETIs exist and how close to us they might be, another being ETI behaviour, which we can only estimate from a sample size of zero), I suggest that this risk be considered alongside the 'big four', both because of its potentially devastating scope and its potentially high likelihood.

We need to spend time thoroughly researching this risk and identifying the most prudent course of action, even if the conclusion is that it can be safely discounted.
