
TL;DR: This is a written version of a talk given at EAG Bay Area 2023. It claims university EA community building can be incredibly impactful, but there are important pitfalls to avoid, such as being overly zealous, overly open, or overly exclusionary. These pitfalls can turn away talented people and create epistemic issues in the group. By understanding these failure modes, focusing on truth-seeking discussions, and being intentional about group culture, university groups can expose promising students to important ideas and help them flourish.


Community building at universities can be incredibly impactful, but important pitfalls can make this work less effective or even net negative. These pitfalls can turn off the kind of talented people we want in the EA community, and it's challenging to tell whether you're falling into them. This post is based on a talk I gave at EAG Bay Area in early 2023[1]. If you are a new group organizer or interested in becoming one, you might want to check out this advice post. The talk was made specifically for university groups, but I believe many of these pitfalls transfer to other groups. Note that I didn't edit this post much and may not be able to respond in depth to comments now.

I have been in the EA university group ecosystem for almost 7 years now. While I wish I had more rigorous data and a better sense of the effect sizes, this post is based on anecdotes from years of working with group organizers. Over the past several years, I went from being extremely encouraging of students doing university community building, even selling it as a default option for students, to becoming much more aware of the risks and concerns, hence writing this talk.

I think I probably over-updated on the risks and concerns, and this led me to be less outwardly enthusiastic about the value of community building over the past year. I think that was a mistake, and I am looking forward to revitalizing the space to a happy medium. But that is a post for another day.

Why University Community Building Can Be Impactful

Before discussing the pitfalls, I want to emphasize that I do think community building at universities can be quite high leverage. University groups can help talented people go on to have effective careers. Students are at a time in their lives when they're thinking about their priorities and how to make a change in the world. They're making lifelong friendships. They have a flexibility that people at other life stages often lack.

There is also some empirical evidence supporting the value of university groups. The longtermist capacity building team at Open Philanthropy ran a mass survey. One of their findings was that a significant portion of people working on projects the team is excited about attributed a lot of value to their university EA groups.

Common Pitfalls in University Group Organizing

While university groups can be impactful, there are several pitfalls that organizers should be aware of. In this section, I'll introduce some fictional characters that illustrate these failure modes. While the examples are simplified, I believe they capture real dynamics that can arise.

Pitfall 1: Being Overly Zealous

One common pitfall is being overly zealous or salesy when trying to convince others of EA ideas. This can come across as not genuinely engaging with people's arguments or concerns. Consider this example:

Skeptical Serena asks, "Can we actually predict the downstream consequences of our actions in the long run? Doesn't that make RCTs not useful?"

Zealous Zack[2] responds confidently, "That's a good point, but even 20-year studies show this is working. There's a lot of research that has gone into it. So it really does work!"

But Serena interjects, "But you can't possibly measure all the effects."

Zack is already talking to someone else, trying to maximize the number of people he talks to. This leaves Serena feeling unheard, as if her arguments were not seriously engaged with, and not wanting to spend more time there.

Salesiness contradicts the truth-seeking nature of EA. It can put off people who are more skeptical and truth-seeking themselves. Additionally, the epistemic norms of the organizer shape the norms of the whole community. If you don't intentionally create an environment for poking holes in arguments, you won't make progress on the problems we care about.

The end result is a group filled with enthusiastic but uncritical people like Agreeable Allen or Easily Persuadable Ellie. The group has selected against people like Skeptical Serena. Since first impressions are sticky, even if Serena hears better arguments later, she may be inoculated against them.

Pitfall 2: Being Overly Open

Another pitfall is being overly open in an attempt to expand EA's appeal or retain people who don't share core EA principles like impartiality and cause neutrality. For example:

A group member says, “I was personally affected by X, so I think we should prioritize it. Our group should spend time looking at X.”

Overly Open Otis responds, “Yeah, it's important to work on things you care about. People in EA disagree about lots of things, so we can spend time on this.”

As a result, Otis spends less time on more effective causes, and outsiders come away feeling that EA is contradictory: it claims to be about effectiveness but focuses on causes people have personal attachments to.

The problem is that organizers' time is extremely limited, and impact can differ by orders of magnitude between causes, so being careful about what you spend time on matters enormously. Being cause-neutral and impartial is also more authentic to EA. By not prioritizing high-impact areas, you leave impact on the table and give a mistaken impression of EA.

Culture and fidelity should be important considerations. Even if you're not trying to explicitly represent EA, it's hard not to implicitly do so in what you focus on. Part of what makes EA great is people being skeptical, challenging assumptions, and prioritizing. If you don't optimize for that in your group, you won't have it.

Pitfall 3: Being Overly Exclusionary

A less common pitfall, often in reaction to the previous two, is being overly exclusionary. For instance:

During a talk on existential risk, a member says, “I'm interested in working on great power conflict.”

Exclusionary Enid responds dismissively, “Nuclear risk is unlikely to actually kill everyone, so it isn't that important.”

The member silently thinks, "But it could still increase the chance of existential risks…"

Enid silently thinks, "Well, obviously he just doesn't get the argument or isn't reasoning well, because he didn't come to the same conclusion as the people I think are really smart."

While investigating questions like the difference between 99% and 100% of people dying is very important for cause prioritization, there are better and worse ways to approach these conversations. Casually dismissing people who are thinking carefully about important topics can be off-putting.

We want open-minded people who can point out where they might be wrong, and we should give people a chance. Being overly exclusionary can lead to a group where everyone holds similar beliefs and no one seriously investigates their uncertainties or pressure-tests their claims. This makes for bad first impressions and bad epistemics. There are ways to focus on the most important things without being dismissive.

Other Pitfalls

There are other pitfalls beyond the ones illustrated, like being uninformed, disorganized, or off-putting in other ways. Optimizing for easily measurable metrics like attendance can give a false sense that what you're doing is right. But cultural norms are much harder to measure than numbers. I won’t go into this fully here, but wanted to flag it.

These Pitfalls Are Sneaky

What makes all of these failure modes particularly insidious is that they're often hidden. It's hard to tell if you're falling into one because we don't like to think we're being overly exclusionary or zealous ourselves. These are often default paths if you are not paying attention to externalities. If you put someone off, they're more likely to disappear than show up in your community survey or have a one-on-one with you. You may never hear about the people you put off.

Checking Your Blind Spots

As a group organizer or community builder, ask yourself: which of these pitfalls am I most likely falling into right now? You're probably doing at least one of these things, so if you think you're not, really interrogate that.

Some diagnostic questions:

  • What metrics are you using as a group? Are you optimizing for numbers and being overly zealous?
  • Where does most of the value come from in your group? Is that where you're spending your time and energy? If not, you may be overly open.
  • Does everyone in your group believe the same thing? Can you think of times you've quickly dismissed skepticism? You might be overly exclusionary.

It's also possible to exhibit different failure modes in different situations. Mapping out your theory of change as a group can help combat some of these. What are you trying to create, and are your actions leading to that? We recommend chatting about this with mentors or other experienced organizers (such as through CEA’s support programs).

Pay attention to possible outcomes, risks, and failure modes, especially less visible externalities. If you're very new, consider taking time to learn more first or starting small. Additionally, there are many valuable things students can do besides community building, like skilling up and/or building career capital for themselves.

An Aspirational Vision

An ideal group might look like:

A core of organizers who get along well and are each knowledgeable in at least one main EA cause area. They can engage in detailed conversations with newcomers and direct them to relevant resources or people. They have strong beliefs weakly held and are good at talking to different types of people.

At the start of a semester, they host open events where people can learn about EA. Even if attendees decide it's not for them, they leave thinking the organizers were humble and reasonable.

Organizers notice who engages substantively with the ideas and offer them ways to get more involved via discussions or dinners. These newcomers meet people who take ideas seriously and who can support their intellectual development. Some become organizers themselves, while others focus on skilling up, working on projects, or exploring career options.

All activities are framed around truth-seeking and helping people reason through the key questions and their career options. There's no pressure towards particular views or paths. People build aptitudes valuable to themselves and the community.

Ways I Could Be Wrong

It's important to note some ways I could be mistaken[3]:

  • I may be overestimating the stickiness of first impressions and idea inoculation. If this effect is weaker than I believe, suboptimal groups are less concerning.
  • I may be wrong about what EA needs. These arguments assume EA needs talented people who can spot our blind spots and carefully reason about key issues. But maybe we need an all-hands-on-deck approach in a specific cause area or focus on increasing donations rather than talent as bottlenecks shift.
  • Combining social and professional networks may be the wrong approach and we should just focus on explicit talent pipelines rather than groups like this.


In summary, common pitfalls like being overly zealous, open, or exclusionary can make university groups less effective. First impressions matter a lot for individuals and group culture, and we may be turning away the people we most need in the community.

I'm still excited about uni groups as a way to expose talented young people to important ideas. But we should stay vigilant to these failure modes, map out our theories of change, and focus on the most impactful actions to build a healthy, truth-seeking community.

  1. ^

    There is no special reason for the timing, and I wish I had posted it sooner. The reason this is coming out now is that Agustín Covarrubias used Claude to turn it into written form.

  2. ^

     This was notably written before CEA gained a CEO named Zach ;)

  3. ^

     There are many more of these! These were just some key ones relevant to the talk.




