
Summary

We’re announcing two EA Global conferences in 2024:

  • EA Global: Bay Area (Global Catastrophic Risks) (Feb 2–4)
  • EA Global: London (May 31–June 2)

We may also run an EA Global in Boston later in 2024, or perhaps a virtual event instead (or both). We’re planning to see how the upcoming Boston and EAGxVirtual events go before making a decision.

Apply Now

EA Global Bay Area (Global Catastrophic Risks) will be our first GCR-focused conference and will:

  • Have talks, workshops, meetups, and so on solely focused on risks from AI and other GCRs, as well as cross-cause efforts such as policy and community-building that are relevant to reducing these risks.
  • Be designed for attendees working (or planning to work) on GCRs or meta/other areas (such as software engineering, law, or community building) where they expect their work to contribute to GCR reduction efforts.

As in previous years, we’ll have a single admissions process across all EA Global events, and we’ll invite attendees to self-select. That is, if you’re focused on other areas, you’re welcome to apply and register for the Bay Area event (and your chances of admission won’t be affected); we just expect it’ll be less useful for you. We expect the London event to be similar to our past London conferences — with most attendees likely still focused on AI and GCRs, and with about 50% of our content focused on these areas.

We’re also announcing our ticketing structure for 2024, which is similar to the structure we had in 2021. The default ticket price for EA Global will be £400 (or $500), with discounts for students, people on low incomes, and select mentors.

The rest of this post explains some of the reasoning behind these decisions.

Why are we running a GCR-specific conference?

Funding

Most of our funding comes from funders who are prioritizing GCR capacity building (notably, Open Phil’s GCR Capacity Building team). Historically, we’ve not attracted substantial funding from donors prioritizing farm animal welfare and global health and wellbeing, though we’d be excited about broadening our donor base over time. Fortunately, support from the GCR funders allows us to maintain programs serving the full scope of EA, even those not directly related to global catastrophic risk reduction.

This also means that our funders are more excited about us experimenting with cause-specific events on GCRs — though we want to clarify that the Events Team is excited about pursuing this too.

We’re excited about exploring cause-specific events

As mentioned above, the Events Team is also excited about exploring cause-specific events. Over the past two years, we’ve run retreats focused on GCRs and have been exploring a conference focused on global health and wellbeing (we’ve also lightly explored running a farm animal welfare conference). We also recently ran a retreat for folks focused on effective giving.

Like many at CEA, the majority (but not all) of the Events Team currently leans towards GCRs in their cause prioritization. Most of the team is still very excited about pursuing other cause area events as well, and is also excited about big tent / principles-first events like the regular EA Global events. But, like our funders, some on the team are excited about putting particular energy into GCR and/or AI safety events.

Historical attendee pool

Our Bay Area events have historically had a very GCR-focused (and specifically AI safety-focused) attendee pool. Part of what we’re doing here is making this more explicit, so that attendees know what to expect. Previously, we’ve heard that these Bay Area events have been less valuable for non-GCR attendees, as there haven’t been many people from their cause area present.

Why keep the event EA branded?

Costs to building a new brand

Since this is a test, we don’t think it’s worth the lift of a significant rebrand (new application portals, new branding, and so on). Having the event under the EA Global umbrella also means we can have one central admissions process, which saves our users from having to apply to multiple events.

We want to avoid creating a new brand unless we feel confident in it, to avoid potentially rebranding multiple times. If this event goes well, we may iterate on it the year after and perhaps create a new event brand if we think there’d be sufficient demand.

Skepticism around a GCR-branded conference

We’re skeptical that we’d get many attendees to come to a “GCRs Global” who wouldn’t otherwise come to an EA Global — currently, the GCR field as EAs conceive of it isn’t a substantial field outside of EA. We think such a conference could make sense as a more introductory event, but EA Global conferences have largely been targeted at people who have engaged with these ideas substantially, and we don’t expect there to be a large pool of people who’ve engaged with GCRs but wouldn’t attend an EA Global.

We’re more excited about an event specific to AI safety (or biorisk) — that seems more clearly something that would attract counterfactual attendees, and it’s more clearly a package of ideas that would make sense to someone outside of EA. We’ve been exploring an AI safety conference and will continue to do so, though this is a bit tricky as our team aren’t AI experts and don’t have authority in this area. We expect such an event would likely involve partnering with an AI safety org to execute their vision, and we’ve been talking with various orgs to explore this. We’ve also already helped support various other (smaller-scale) AI safety and biorisk events.

Ticket pricing

In a previous post we outlined our event costs and mentioned that we would likely increase ticket prices to help recoup some of them. We wanted to do this in a way that wouldn’t price out the bulk of our attendees, some of whom are students, are on low incomes, or are attending mostly to provide value to others at the event (in which case it makes less sense for them to pay much to attend).

As a result, we’re planning to raise our ticket prices to a similar level to what they were in 2021. We’ll have default ticket prices of £400 (or $500), with discounted options for students, people on low incomes, and select mentors. In general, we’d like everyone to pay something in order to create some buy-in, so we’ll be more restrictive about who can get a free ticket. We encourage folks to email us if they need a free ticket due to financial hardship, and we expect to grant nearly all such requests (though we may have to backtrack here if we get inundated).

Other info

  • We’re in the process of organizing many EAGx events in 2024 and think it’s likely that there’ll be at least one EAGx event in the US in 2024 (as well as events in various other locations).
  • We expect the EA Global: Bay Area (GCRs) event to be pretty similar to other EA Globals; we’ll likely keep the food all-vegan and continue using Swapcard, and we expect the branding and feel of the event to remain the same.
Apply Now
Comments

I love all of these decisions and the reasoning behind them, and I think you should go even further in this direction. I think it might be both cheaper and more effective to mostly abandon EAGs and run smaller, more specialised events instead.

My hypothesis is that people from different cause areas don’t get much out of interacting with each other at EAGs. This hypothesis can be tested with questions on post-EAG surveys. I believe the hypothesis because I just stick to other animal advocates at EAGs, since these are the people with whom I have the most productive and work-relevant conversations. I see other animal advocates doing the same.

Currently, I see EAGs as three or four barely related conferences running in the same building at the same time. This has drawbacks. Attendees have to navigate a bigger venue to find talks and 1-1s. More importantly, attendees are less likely to start chats with random strangers or join a random group of people talking because there are fewer common things to talk about. You’re less likely to randomly bump into the people you’d have the most productive conversations with. Or you might bump into them later in the conference which would give you less time to spend with them.

I sometimes do talk with people from other cause areas but it often goes something like this:

“What do you work on?”

“Animals, you?”

“AI”

We might part ways after that, or we might talk about food, how we got into EA, or our jobs.[1] But in either case, I’m unlikely to change my mind on something work-relevant or to find a new collaborator in that conversation. While it’s nice to make friends with random people, it’s not a good use of time on a weekend that costs “around $1.5k–2.5k per person.” If all of EAG were like that, even providing a guide dog to a blind person for $50k would seem like a better use of charity money. So these are not the interactions you want to foster.

I do think that a productive cross-pollination of ideas from different EA cause areas is possible. But smaller events dedicated to a specific type of interplay between cause areas might be much better at this. The person working on AI mentioned above and I might find a productive conversation topic much more easily at an event like the AI, Animals, and Digital Minds conference, because the talks and the theme would prompt us to think about relevant topics. (Although a talk on such a topic at an EAG might also prompt a similar conversation.)

For other types of cross-pollination, local EA events might be better and cheaper. It’s not as if I need a specific AI safety specialist to fly in so I can ask them my beginner questions about AI safety (or debate prioritising animals vs AI). A local EA interested in AI would do. Making local EA friends might also be better because it can be more helpful for sustaining motivation. Also, when I was thinking about switching from animals to AI, I didn’t have time at an EAG to grab a random AI safety person and ask them beginner questions, because I was busy meeting animal advocates and they were probably busy too. And I didn’t know which AI safety person to grab.

I think it’s useful to ask “did people from all around the world need to fly in for this?” when considering which conversations we want to encourage at global conferences. Examples that satisfy this criterion include (1) people working on similar things in different countries learning from each other, (2) meeting the few other people in the world who work on your niche topic, and (3) meeting your colleagues in person. More specialised conferences would likely make all such conversations easier.

  1. ^

    We could also try to find out why we chose to prioritize different cause areas. The first few such conversations can be very productive, but in my experience, most people who have been in EA for a while have had too many such conversations already. I’m open to the possibility that I’m wrong about this, but in that case, I’d rather organise separate smaller local events for people who are interested in cause area battles.

I'm a little late to this thread, but I think this is very regrettable. I feel quite strongly that CEA should be building and growing a "big tent" Effective Altruism, around the core principles of EA. I think this announcement is quite corrosive to that goal. 

I strongly support cause-specific field building, but this is best suited for sister organisations and not the Centre for Effective Altruism. 

A lot of organisations in the EA community building space are underperforming, including CEA and including the organisation that I run. That's okay. We just need to make steady progress to get where we need to be. But I believe this is a significant step backwards, both in terms of the core vision of CEA and its actual output.

I don't think I agree that CEA shouldn't be doing cause-specific events, and I think that, given how the past couple of Bay Area EAGs went, this is a pretty natural decision.

But it does seem pretty regrettable that there'll be no cause-general EAG in the Americas next year.

Just want to clarify — it's still possible that there will be a cause-general EAG in the Americas next year (I'd put it at slightly more than 50% likely, but this number is semi-made up).

I think the crux for me is the use of EAG branding for an event that doesn't represent all of Effective Altruism. If, as happened last year, CEA were to run an event focused on a particular area without the EAG branding, I wouldn't be too concerned.

Thank you for the update and all of the work you're putting into these events. I know you're likely busy with EAG Boston, but a few questions when you have the time:

1. Is the decision about whether to run an east coast EAG in 2024 primarily about cost? And if an east coast EAG does happen in 2024, will it definitely be in Boston, or could it be in DC or a cheaper city?

2. If you had 2x or 3x the budget for EAGs, do you think you would organize a cause-neutral EAG in the Bay Area in addition to a GCR conference? How would more funding affect cause-specific vs. big-tent event planning?

3. Do you envision content focused on digital sentience and s-risks at the GCR conference? I'm personally worried that AI risk and biorisk are reducing the airtime for other risks (nuclear war, volcanoes, etc.), including suffering risks. Likewise, I'd still love to see GCR-oriented content focused on topics like how climate change might accelerate certain GCRs, the effects of GCRs on the global poor, the effects of GCRs on nonhuman animals, etc.

(Also, I hope all EAG events remain fully vegan, regardless of the cause area content!)

Thanks for the questions Rocky! Will try to answer them below:

1. On whether to run an east coast EAG: I'd say cost is definitely the biggest factor here, though there are other smaller factors, such as whether a third EAG gets enough unique attendees and the general question of at what point we hit diminishing returns for number of EAGs per year. Re what city it would be hosted in, my guess is that Boston is the most likely option, followed by either NYC or DC, but I'm not sure. My rough sense is that the trade-offs aren't quite worth it to do the event in a cheaper city because it likely wouldn't be sufficiently cheaper, though I'm open to it and haven't thought about this super deeply.

2. If we had a much larger budget I do think we'd at least push harder for a cause-neutral EAGx in the Bay Area (this is something we're considering anyway, though we'd need to find a team to run the event, as well as funding for it). Though with a much larger budget the thing I'd probably do first is provide more travel grants for our events, as we currently only provide these on a fairly limited basis. I'm not sure that funding would strongly affect our proportions of cause-specific vs big-tent events at this stage, especially as I see the GCR event as a test (and as such am not that keen to run two of them in one year).

3. I'm open to content on digital sentience and s-risks at the GCR EA Global, as well as some of the other sub-topics you mention — and I do expect they would be within the scope of the event. The main question would be whether there are any specific talks or sessions within those areas we're sufficiently excited about hosting (and whether we think there are high quality and eager speakers who would do these topics justice).

Why is Boston favored over DC? I'd expect DC would have more EAs in general than Boston, plus would open up valuable policy-focused angles of engagement.

The main issue is that some DC-based stakeholders have expressed concern that an EAG DC would draw unwanted attention to their work, partly because EA has negative connotations in certain policy/politics crowds. We're trying to evaluate how serious these concerns (still) are before making a decision for 2024.

I'm also curious about this. Boston is convenient to me as a Cambridge resident, but I'd guess that holding an event in DC would be more valuable.

The term "global catastrophic risk" has been defined in multiple different and mutually inconsistent ways.[1] What will the Bay Area EAG focus on, specifically? And is there a specific reason why this term was chosen instead of a less ambiguous one?

  1. ^

    That comment doesn't even include all the definitions of "global catastrophic risk" that I've seen. According to Wikipedia, "[m]ost global catastrophic risks would not be so intense as to kill the majority of life on earth, but even if one did, the ecosystem and humanity would eventually recover (in contrast to existential risks)," directly contradicting a lot of other definitions people have given, especially Open Phil.

Thanks for the comment! I expect the main cause areas represented at the Bay Area event to be AI safety, biorisk, and nuclear security. I also expect there'll be some meta-related content, including things like community building, improving decision making, and careers in policy.

We weren't sure exactly what to call this event and were torn between this name and EA Global (X-Risk). We decided on EA Global (GCRs) because it was the majority preference of the advisors we polled, and because we felt it would more fully represent the types of ideas we expect to see at the event, as nuclear security and some types of risks from advanced AI or synthetic biology may not quite be considered existential in nature.

Thanks for the news; I appreciated the thoughtful exploration of the costs and benefits of an EA conference focused on global catastrophic risks. I agree that the benefits outweigh the costs, and the reasoning here seems sound. Hope I can attend!
