
OllieBase

Community Event Manager @ Centre for Effective Altruism
6165 karma · Joined · Working (0-5 years)
Interests: Forecasting

Sequences (1)

CEA Community Events Retrospective

Comments (343)

  • (I remember in the early days of 80,000 Hours, we spent a whole day hosting a UHNW. He ultimately gave £5000. The week afterwards, a one-hour call with Julia Wise - a social worker at the time - resulted in a larger donation.)

 

Every few months I learn about a new way Julia has had a significant impact on this community, and it never ceases to give me a sense of awe and appreciation for her selflessness. EA would not be what it is today without Julia.

Just a heads up that this was posted on April Fools' Day, but it seems like a serious post. You might want to add a quick disclaimer at the top for today :)

We (the CEA Events Team) recently posted about how we cut costs for EA Global last year. That's a big contributing factor, and involved hiring someone (a production associate) to help us cut overall costs.

It seems like you're making a few slightly different points:

  1. There are much more pressing things to discuss than this question.
  2. This question will alienate people and harm the EA brand because it's too philosophical/weird.
  3. The fact that the EA Forum team chose this question given the circumstances will alienate people (kind of a mix between 1 and 2).

I'm sympathetic to 1, but disagree with 2 and 3 for the reasons I outlined in my first comment.

I disagree that we should avoid discussing topics so as to avoid putting people off this community.[1] 

  • I think some of EA's greatest contributions come from being willing to voice, discuss and seriously tackle questions that seemed weird or out of touch at the time (e.g. AI safety). If we couldn't do that, and instead remained within the Overton window, I think we'd lose a lot of the value of taking EA principles seriously.
  • If someone finds the discussion of extinction or incredibly good/bad futures off-putting, this community likely isn't for them. That happens a lot!
  1. ^

    Perhaps for some distasteful-to-almost-everyone topics, but this topic doesn't seem like that at all.

"few people are thinking about how to navigate our way to a worthwhile future."
This might be true on the kinds of scales EAs are thinking about (potentially enormous value, long time horizons), but is it not the case that many people want to steer humanity in a better direction? E.g. the Left, environmentalists, libertarians, ... ~all political movements?

I worry EAs think of this as some unique and obscure thing to think about, when it isn't.

(On the other hand, people neglect small probabilities of disastrous outcomes.)

36% agree

It seems plausible to me that we might be approaching a "time of perils" where total x-risk is unacceptably high and will continue to be as we develop powerful AI systems, but might decrease later once we can use AI systems to tackle x-risk (though that seems hard, and risky in its own myriad ways).

Broadly, I think we should still prioritise avoiding catastrophes in this phase and bet on being able to steer later, though with low confidence.

"Walking around the conference halls this February at EAG Global in the Bay Area, the average age seemed to be in the mid-20s or so."

The average age of EAG Bay Area 2025 feedback survey respondents was 30, FYI. 

I don't think this blunts the thrust of your questions, which I think are good and important, but people do seem to consistently underestimate the average age of EA Global attendees.
