
TL;DR: This is a written version of a talk given at EAG Bay Area 2023. It claims university EA community building can be incredibly impactful, but there are important pitfalls to avoid, such as being overly zealous, overly open, or overly exclusionary. These pitfalls can turn away talented people and create epistemic issues in the group. By understanding these failure modes, focusing on truth-seeking discussions, and being intentional about group culture, university groups can expose promising students to important ideas and help them flourish.

Introduction

Community building at universities can be incredibly impactful, but important pitfalls can make this work less effective or even net negative. These pitfalls can turn off the kind of talented people that we want in the EA community, and it's challenging to tell if you're falling into them. This post is based on a talk I gave at EAG Bay Area in early 2023[1]. If you are a new group organizer or interested in becoming one, you might want to check out this advice post. This talk was made specifically for university groups, but I believe many of these pitfalls transfer to other groups. Note that I didn't edit this post much and may not be able to respond in depth to comments now.

I have been in the EA university group ecosystem for almost 7 years now. While I wish I had more rigorous data and a better idea of the effect sizes, this post is based on anecdotes from years of working with group organizers. Over the years, I went from being extremely encouraging of students doing university community building, selling it as a default option for students, to becoming much more aware of the risks and concerns, which led me to write this talk.

I think I probably over-updated on the risks and concerns, and this led me to be less outwardly enthusiastic about the value of CB over the past year. I think that was a mistake, and I am looking forward to revitalizing the space to a happy medium. But that is a post for another day. 

Why University Community Building Can Be Impactful

Before discussing the pitfalls, I want to emphasize that I do think community building at universities can be quite high leverage. University groups can help talented people go on to have effective careers. Students are at a time in their lives when they're thinking about their priorities and how to make a change in the world. They're making lifelong friendships. They have a flexibility that people at other life stages often lack.

There is also some empirical evidence supporting the value of university groups. The longtermist capacity building team at Open Philanthropy ran a mass survey. One of their findings was that a significant portion of people working on projects they're excited about attributed a lot of value to their university EA groups.

Common Pitfalls in University Group Organizing

While university groups can be impactful, there are several pitfalls that organizers should be aware of. In this section, I'll introduce some fictional characters that illustrate these failure modes. While the examples are simplified, I believe they capture real dynamics that can arise.

Pitfall 1: Being Overly Zealous

One common pitfall is being overly zealous or salesy when trying to convince others of EA ideas. This can come across as not genuinely engaging with people's arguments or concerns. Consider this example:

Skeptical Serena asks, "Can we actually predict the downstream consequences of our actions in the long run? Doesn't that make RCTs not useful?"

Zealous Zack[2] responds confidently, "That's a good point but even 20-year studies show this is working. There's a lot of research that has gone into it. So, it really does work!"

But Serena interjects, "But you can't possibly measure all the effects."

Zack is already talking to someone else, trying to maximize the number of people he talks to. This leaves Serena feeling unheard, like her arguments were not seriously engaged with, and not wanting to spend more time there.

Salesiness contradicts the truth-seeking nature of EA. It can put off people who are more skeptical and truth-seeking themselves. Additionally, the epistemic norms of the organizer shape the norms of the whole community. If you don't intentionally create an environment for poking holes in arguments, you won't make progress on the problems we care about.

The end result is a group filled with enthusiastic but uncritical people like Agreeable Allen or Easily Persuadable Ellie. The group has selected against people like Skeptical Serena. Since first impressions are sticky, even if Serena hears better arguments later, she may be inoculated against them.

Pitfall 2: Being Overly Open

Another pitfall is being overly open in an attempt to expand EA's appeal or retain people who don't share core EA principles like impartiality and cause neutrality. For example:

A group member says, “I was personally affected by X, so I think we should prioritize it. Our group should spend time looking at X.”

Overly Open Otis responds, “Yeah, it's important to work on things you care about. People in EA disagree about lots of things, so we can spend time on this.”

This means Otis spends less time on more effective causes, and outsiders feel like EA is contradictory: it claims to be about effectiveness but focuses on things we have personal attachments to.

The problem is that organizers' time is extremely limited. Being careful about where you spend it matters because impact can differ by orders of magnitude between causes. Being cause-neutral and impartial is also more authentic to EA. By not prioritizing spending time on high-impact areas, you leave impact on the table and give a mistaken impression of EA.

Culture and fidelity should be important considerations. Even if you're not trying to explicitly represent EA, it's hard not to implicitly do so in what you focus on. Part of what makes EA great is people being skeptical, challenging assumptions, and prioritizing. If you don't optimize for that in your group, you won't have it.

Pitfall 3: Being Overly Exclusionary

A less common pitfall, often in reaction to the previous two, is being overly exclusionary. For instance:

During a talk on existential risk, a member says, “I'm interested in working on great power conflict.”

Exclusionary Enid responds dismissively, “Nuclear risk is unlikely to actually kill everyone, so it isn't that important.”

The member silently thinks, "But it could increase the chance of existential risks…"

Enid silently thinks, "Well, obviously he just doesn't get the argument/isn't reasoning well, because he didn't come to the same conclusion as the people I think are really smart."

While investigating ideas like the difference between 99% and 100% of people dying is very important for cause prioritization, there are better and worse ways to approach these conversations. Quickly dismissing people who are thinking carefully about important topics can be off-putting.

We want open-minded people who can point out where they might be wrong. We should give people a chance. Being overly exclusionary can lead to a group where everyone has similar beliefs and no one is seriously investigating their uncertainties or pressure-testing their claims. This makes for bad first impressions and epistemics. There are ways to focus on the most important things without being dismissive.

Other Pitfalls

There are other pitfalls beyond the ones illustrated, like being uninformed, disorganized, or off-putting in other ways. Optimizing for easily measurable metrics like attendance can give a false sense that what you're doing is right. But cultural norms are much harder to measure than numbers. I won’t go into this fully here, but wanted to flag it.

These Pitfalls Are Sneaky

What makes all of these failure modes particularly insidious is that they're often hidden. It's hard to tell if you're falling into one because we don't like to think we're being overly exclusionary or zealous ourselves. These are often default paths if you are not paying attention to externalities. If you put someone off, they're more likely to disappear than show up in your community survey or have a one-on-one with you. You may never hear about the people you put off.

Checking Your Blind Spots

As a group organizer or community builder, ask yourself: Which of these pitfalls am I most likely falling into right now? You're probably doing at least one of these things, so really interrogate that if you think you're not.

Some diagnostic questions:

  • What metrics are you using as a group? Are you optimizing for numbers and being overly zealous?
  • Where does most of the value come from in your group? Is that where you're spending your time and energy? If not, you may be overly open.
  • Does everyone in your group believe the same thing? Can you think of times you've quickly dismissed skepticism? You might be overly exclusionary.

It's also possible to exhibit different failure modes in different situations. Mapping out your theory of change as a group can help combat some of these. What are you trying to create, and are your actions leading to that? We recommend chatting about this with mentors or other experienced organizers (such as through CEA’s support programs).

Pay attention to possible outcomes, risks, and failure modes, especially less visible externalities. If you're very new, consider taking time to learn more first or starting small. Additionally, there are many valuable things students can do besides community building, like skilling up and/or building career capital for themselves.

An Aspirational Vision

An ideal group might look like:

A core of organizers who get along well and are each knowledgeable in at least one main EA cause area. They can engage in detailed conversations with newcomers and direct them to relevant resources or people. They have strong beliefs weakly held and are good at talking to different types of people.

At the start of a semester, they host open events where people can learn about EA. Even if attendees decide it's not for them, they leave thinking the organizers were humble and reasonable.

Organizers notice who engages substantively with the ideas and offer them ways to get more involved via discussions or dinners. These newcomers meet people who take ideas seriously and who can support their intellectual development. Some become organizers themselves, while others focus on skilling up, working on projects, or exploring career options.

All activities are framed around truth-seeking and helping people reason through the key questions and their career options. There's no pressure towards particular views or paths. People build aptitudes valuable to themselves and the community.

Ways I Could Be Wrong

It's important to note some ways I could be mistaken[3]:

  • I may be overestimating the stickiness of first impressions and idea inoculation. If this effect is weaker than I believe, suboptimal groups are less concerning.
  • I may be wrong about what EA needs. These arguments assume EA needs talented people who can spot our blind spots and carefully reason about key issues. But maybe we need an all-hands-on-deck approach in a specific cause area or focus on increasing donations rather than talent as bottlenecks shift.
  • Combining social and professional networks may be the wrong approach and we should just focus on explicit talent pipelines rather than groups like this.

Conclusion

In summary, common pitfalls like being overly zealous, open, or exclusionary can make university groups less effective. First impressions matter a lot for individuals and group culture, and we may be turning away the people we most need in the community.

I'm still excited about uni groups as a way to expose talented young people to important ideas. But we should stay vigilant to these failure modes, map out our theories of change, and focus on the most impactful actions to build a healthy, truth-seeking community.

  1. ^

There is no special reason for the timing, and I wish I had posted it sooner. The reason this is coming out now is that Agustín Covarrubias used Claude to turn it into written form.

  2. ^

     This was notably written before CEA gained a CEO named Zach ;)

  3. ^

There are many more of these! These were just some key ones relevant to the talk.

Comments

Thanks for making a post for this! Coincidentally (probably both causally downstream of something) I had just watched part of the EAG talk and was like "wow, this is surprisingly helpful, I really wish I had access to something like this back when I was in uni, so I could have at least tried to think seriously about plotting a course around the invisible helicopter blades, instead of what I actually did, which was avoiding it all with a ten-foot pole".

I'm pretty glad that it's an 8-minute post now instead of just a ~1-hour video.

