OllieRodriguez

AI Events Program Lead @ Centre for Effective Altruism
6913 karma · Joined · Working (6-15 years)

Bio

Formerly Ollie Base (until August 2025)

Posts
40

Sorted by New

Sequences
1

CEA Community Events Retrospective

Comments
375

Thanks Jan, I appreciate the pushback.

Just wanted to flag the group is heavily selected for belief alignment with something like "EA/Constellation/Trajan House" views

As an event focused on x-risk, yes, I think this is fair.

"AI enabled human takeovers" was promoted as agenda to prioritize in multiple widely read memos by high statues people in the community  (which the organisers prioritized in the reading list). 

It's true that:

  • The agenda featured some talks emphasising risks from AI-enabled human takeover.
  • Some of the most popular memos also emphasised this risk category.
  • Some people took the survey after reading the agenda and the memos.

But I don’t think attendees were as strongly influenced as you seem to imply:

  • We highlighted some memos to read at the beginning, but soon after launching the memo platform, we prioritized memos by votes from attendees. Memos making the case for more emphasis on risks from aligned AI were heavily upvoted, and some memos that we highlighted from the beginning received fewer upvotes.
  • The survey was in part motivated by disagreements I'd heard about how the AIS community was allocating resources. While I'm sure some attendees were influenced by information they recently encountered, many will have thought about these questions in advance of the survey.
  • I don't have the full data, but I think it's likely that many attendees completed the survey before engaging with the memos and before the full agenda was published. 

I do think you're pointing to a real effect worth being aware of, and thanks for raising it, but I don't think it's as significant as you imply (though maybe you don't think it's super significant either).

Results are framed as "leaders and key thinkers in the x-risk and AI safety communities agree"

I think the areas of broad consensus accurately (if roughly) reflect the data we have here and what we saw in memos. FWIW, my overall takeaway from running this survey is that leaders and key thinkers have a wide range of views and I think this post captures and conveys this.

I take "AGI goes well" to imply a wealthy and technologically advanced society. I think that could mean:

- Very cheap and delicious meat alternatives.
- Factory farming waning as it hits inefficiencies and bottlenecks, unable to compete with the above.
- More demand for higher-welfare options like free-range and local produce.

But it also seems possible that we "lock in" factory farming and scale it further and that AGI adopts speciesist views.

Very uncertain, I don't find myself strongly disagreeing with claims across the spectrum.

I think most answers here are missing what seems to me the most likely explanation: the people who are motivated by EA principles to engage with politics are not public about their motivations or affiliations with EA. Not just because the EA brand is disliked by some political groups, but because it seems generally wise to avoid having strong ideological identities in politics beyond motivations like "do better for my constituents".

Cool! For a brief second, I thought this post was going to be an extremely long list of the trillions of sentient beings my actions could influence, but this is much more digestible.

Quick take:

  • Yes, the sample I looked at did have some very expensive retreats, and I think you can run much cheaper ones. Note, though, that EAGx events / conferences can also be much cheaper, so you should adjust on both sides (I think I had a cost-inflated sample due to high spending on community building in 2022).
  • I still think outcomes-per-person at retreats often don't seem that different from larger events, so returns to scale are often real, i.e. focus on cost-per-attendee. If your theory of change involves helping lots of people you don't know well find careers in EA/AIS, I think going bigger is usually a good move.
  • Retreats are definitely a useful intervention, especially when you have a smaller group whose needs/goals you know well (e.g. more involved community members looking to go deeper on topics).

I had a reminder to check back on this. I had a quick scan, and I don't think this happened. Joe's post probably meets the bar, and does suggest it's still a contentious issue, but I can't find 9+ more, so not as contentious as you predicted :)

Starting afresh seems like the right move here, and I think it's super commendable to share that you're re-committing. 

I have the same problem when it comes to end-of-year donations, and that prompted me to move to monthly donations (even if the idealized version of me would save accordingly and then make bigger, more thoughtful donations at the end of the year).

Also:

In total, I’ve given about half of what I pledged since 2016.

This is still a lot of money, and a lot of good. Giving 5% of your income to charity for almost 10 years is a hugely generous and selfless thing to do :)

Cool! FYI when I open your home page on a large monitor, the "Log In" and "Blog" buttons overlap with each other (fine on small screens).
