
Within EA, work on x-risk is very siloed by type of threat: there are the AI people, the bio people, and so on. Is this bad or good?

Which of these is the correct analogy?

  1. "Biology is to science as AI safety is to x-risk," or 
  2. "Immunology is to biology as AI safety is to x-risk"

EAs seem to implicitly think analogy 1 is correct: some interdisciplinary work is nice (biophysics), but most biologists can just be biologists (i.e., most AI x-risk people can just do AI).

The "existential risk studies" model (popular with CSER, SERI, and lots of other non-EA academics) seems to think that analogy 2 is correct, and that interdisciplinary work is totally critical—immunologists alone cannot achieve a useful understanding of the entire system they're trying to study, and they need to exchange ideas with other subfields of medicine/biology in order to have an impact, i.e. AI x-risk workers are missing critical pieces of the puzzle when they neglect broader x-risk studies.

Today is Asteroid Day. From the website:

Asteroid Day, as observed annually on 30 June, is the United Nations-sanctioned day of public awareness of the risks of asteroid impacts. Our mission is to educate the public about the risks and opportunities of asteroids year-round by hosting events, providing educational resources and regular communications to our global audience on multiple digital platforms.

I didn't know about this until today. Seems like a potential opportunity for more general communication on global catastrophic risks.
