
Long run thoughts

Three thoughts I have been turning over on my long runs lately: 

  1. I find the EA conversation (and LessWrong and other forums) challenging to engage in, which is why this is my first post. There is substantial historical content but seemingly no context or prioritization without reading the entire backstory, the full debate, every tangential thread, every post on every writer's related blog, and then the full 27-tweet thread with all its replies. Where did certain ideas leave off? Which avenues are worth pursuing? Which ones have been ruled out? Who are the key players, and how can we see what they are working on? Seriously, how is the next generation supposed to get up to speed like this? It seems ill-conceived to torture them with links to websites that are no longer maintained and 500,000-word debates. I'm not saying we have to distill everything down to a TikTok, but surely the tools exist to wrangle the immense amount of information generated here and elsewhere into something more usable for a newcomer. How would we expect EA to grow otherwise? It would just turn into "those nuts who stuck around long enough to be able to reference that one blog post from 2004." How can we make the knowledge transfer a little easier for the next generation and the one after that? Who is already doing a good job of shepherding newcomers, youngsters, and apprentices onto an information-heavy path, and how are they doing it?
  2. Related to the above, isn't identifying what is effective in altruism a dynamic process? How often, and by what process, do we review past conclusions to assess whether our previous assumptions still hold and whether there are new facts to take into account? Why is the "what's important" list so often presented as static rather than as a dynamic, visual dashboard that is updated periodically as the environment changes?
  3. Why is EA so, so, so, so white and male? That so many of the conversations around AI and other issues the EA community considers critical to the planetary future are dominated by white men, and that the related decisions are made by them, makes me (female, albeit white) extremely nervous. It's not like we have a good track record, y'all.