joko

29 · Joined Aug 2022

Posts
4


Comments
3

I appreciate the context, thank you. However, two points came to mind:

  1. It seems like the purpose is quite different from that of the medium-sized university department you described: running workshops and retreats vs. what standard academic departments do (offices, lecture halls, labs). So I'm not sure how apt the comparison is.
  2. You point out that in the context of university buildings, it's not a lot of money. But in the context of CEA's other spending, it does seem like a lot. CEA received $14 million in funding from FTX[1], which has been discussed a lot. So it's understandable that spending a comparable amount of money on a single venue, without much public explanation, will raise some eyebrows.

Either way, I don't think anyone can really judge whether the investment was a good decision based on the currently available information, which is why I'd appreciate a more detailed explanation from CEA.

  1. ^

    Number taken from the wiki entry on CEA. I chose this comparison because I couldn't immediately find recent figures for CEA's total spending, but I assume that $15 million is a significant portion of it.

Thank you for this post, looking forward to the other parts of the series! I enjoy this format of explaining how EA came to care about a specific cause area and what shaped the current understanding of the topic. I'd be interested in more "history of [cause area / some aspect of EA culture]" posts.

What is the main idea this video is trying to convey? Based on the title and description, I assumed the goal would be to introduce the key ideas of longtermism/x-risks and promote WWOTF. It did the latter, but I don't think the video presents longtermist ideas in a very clear way.

Earlier today, I watched the video with a couple of friends who had never heard about longtermism and x-risks before. It did not do a good job of sparking discussion. When we talked about the video, the main takeaways were something like:

  • Civilizations have collapsed before.
  • If it happens again, we will most likely recover.
  • To make sure that we actually recover, we should stop burning coal. However, everyone was already convinced that we should stop burning coal because of climate change arguments.
  • My friends were mostly confused about what longtermism is and why it is related to EA.

Afterwards, I suggested reading Will's guest essay in the NYT. From what I could tell, that article got my friends a lot more excited about reading WWOTF and seemed to resolve the confusion about longtermism and EA. In the future, I will definitely send people the NYT article as an introduction to longtermism, or this WWOTF book review by Ali Abdaal for those who really prefer watching videos.