
TL;DR: The EA movement should not use techniques that alienate people from the EA community as a whole if they do not align with a particular subgroup within the community. These approaches not only have an immediate negative impact on the EA community, but also have long-term repercussions on the sub-community utilizing them. Right now, the EA movement uses these sorts of tactics too often.

People connect with the EA movement through many different channels, and often encounter sub-communities before they have a full understanding of the movement and the wide variety of opinions and viewpoints within it. These sub-communities can sometimes make the mistake of using "polarizing techniques". By this, I mean strategies that alienate people or burn bridges with the broader community. This could be from pushing a sub-perspective too hard, or being aggressively dismissive of other views.

An example of this might be if I met a talented person at a party and they said they wanted to change to an impactful career, but had never heard of EA. If I then proceeded to aggressively push founding a charity through Charity Entrepreneurship (the organization) as a career path, to the point where they got turned off of EA altogether because they didn't come on board with my claims, I would consider that a polarizing approach: either they choose charity entrepreneurship as a path, or they don't engage with effective altruism at all. Note that in the short term, all Charity Entrepreneurship really measures impact-wise is how many great charities get started, so a good person going into policy because I connected them to Probably Good counts for nothing toward our organizational impact. Taken to an extreme, it might be worth pushing quite hard if I think that founding nonprofits is many times more important as a career path than policy. However, I think this style hurts both the community and Charity Entrepreneurship long-term.

This phenomenon occurs across a diverse range of people, both in terms of funding and career transitions. Most often, it revolves around cause prioritization. It can be disappointing when someone does not share your enthusiasm for your preferred causes, but there is still a lot of value in directing them to the most impactful path they would in fact consider pursuing. 

Why it Hurts the Community

The clearest way this technique is damaging is that turning someone off from one part of the community often demotivates them from engaging positively in other parts of it. It makes them more likely to become an active critic, rather than a neutral or contributing member of a different sub-community, or of the philosophy of effective altruism as a whole.

Different sub-communities look for different types of people and resources. It's difficult for one person to have a bird's-eye view of all sub-communities within EA, and it's easy to overvalue your own sub-community's particular needs or strengths. On numerous occasions, I have witnessed one sub-community dismiss individuals possessing skills that would be immensely valuable in another segment of the community. It seems worth exercising a degree of modesty in determining who exactly is a good fit for the community as a whole.

The EA community is diverse in its viewpoints, cause areas, and approaches, and this diversity brings many benefits. The most promising career, the most neglected cause area, or what seems to be missing most from the community has changed many times over the years. Even if someone has a comprehensive understanding of all EA sub-communities, accurately predicting their long-term needs can be challenging. It is not uncommon for a community to inadvertently alienate a group that eventually emerges as crucial in the years to come.

Why it Hurts the Sub-Community Long Term

People talk. Generally, when someone has a negative experience with a community, or feels as though they were treated badly for not conforming with a specific view, they share that perception with others in their network. Maybe the person who was alienated from a sub-community was not a great fit, but their friends, colleagues, and broader community could potentially be a hugely beneficial match. A lot of people are introduced to EA via personal networks; they can be turned off in the exact same way.

Defecting begets defection: If one sub-community in EA uses polarizing techniques, it increases the odds of others doing the same. There is a lot more pressure for me to hard-sell charity entrepreneurship as a career path to someone if everyone else is doing the same for their preferred career path. This can result in a decline in impact, as individuals become disengaged or simply opt for the first career path they encounter instead of the one that aligns best with their skills and goals.

People sometimes join the EA movement through one sub-community and then switch to another. Even if you assign zero value to members of other sub-communities (which already seems like a bad norm to have), it is very common for people to engage in one area for years before switching to another. Hard-selling people a cause early might reduce the chance of this happening and cause them to drift away from EA altogether.

In Conclusion

I think it should be a given that if someone encounters the EA movement but is not the right fit for the sub-community they first encounter, that community should be kind and cooperative. Guiding individuals towards the areas where they are likely to find a good fit, rather than adopting an aggressively dismissive attitude or forcefully pushing them down a path that contradicts their goals, would be immensely advantageous for the entire community.

Some ways that CE has done this, which I think are pretty easily replicable, include:

  • Getting people to map out the impact of alternative comparative career paths during our program to deeply consider their counterfactuals before launching a charity
  • Connecting people who get far into our process to other job opportunities, even outside of our cause areas
  • Recommending that CE is not the right choice for someone, based on their values and counterfactuals, even if they got into the program
  • Connecting funders to other philanthropists who are interested in the same area even in areas we are not working in
  • When a funder/founder is not accepted into a program, trying hard to leave them with a positive view of EA and the broader charity world by sending them resources and information about alternative career paths






At EA for Christians, we often interact with people who are altruistic and focused on impact but do not want to associate with EA because of its perceived anti-religion ethos.

On the flip side, since becoming involved with EA for Christians, a number of people have told me they are Christian but keep it quiet for fear it will damage their career prospects.

And to add another way the anti-religion ethos is harmful, people may not be comfortable talking to their Christian friends about EA (or even about topics considered aligned with EA) in the first place.

Thanks! I agree with this post. 

I notice that I want to reframe this more positively, as "If you meet someone who is not a good fit for your approach to doing good, you should try to signpost them to communities/organizations that are a better fit". (But maybe that's saying something importantly different from your point?)

This is close to what I am saying, but I might phrase it stronger. For example, a large donor may consistently be a potential fit for your field, but I still believe it's important to be considerate about how far you push them. Similarly, a highly talented individual might require more than just signposting; they also should not be perceived as second-class or unintelligent for having a different viewpoint.

Related: this excellent article on generous exclusion by Priya Parker with specific examples of how to do this: https://www.priyaparker.com/art-of-gathering-newsletter/why-the-more-is-not-always-the-merrier

I recently had a conversation with a local EA community builder. Like many local community builders they got their funding from CEA. They told me that their continued funding was conditioned on scoring high on the metric of how many people they directed towards long-term-ism career paths. 

If this is in fact how CEA operates, then I think this is bad, because of the reasons described in this post. Even though I'm in AI safety, I value EA being about more than x-risk prevention.

Hey Linda,

I'm head of CEA's groups team. It is true that we care about career changes, and it is true that our funders care about career changes. However, it is not true that this is the only thing we care about. There are lots of other things we value: for example, grant recipients have started effective institutions, set up valuable partnerships, and engaged with public-sector and philanthropic bodies. This list is not exhaustive! We also care about how welcoming groups are, and we care about groups not using "polarizing techniques".

In terms of longtermist pressure: I have recently written a post about why we believe in a principles-first approach, rather than an explicitly longtermist route.

I have heard similar sentiments as Linda from multiple sources, including some community builders, so I am wondering if there might be some miscommunication going on. 
Could you give some concrete examples to help clarify this? For example, how does the CEA groups team value e.g. 1 person going to work for a GiveWell top charity vs 1 person going to work for a top AI charity?

Hey Miri,

Typically, unless someone is donating large amounts of money - we would interpret direct work as more valuable. But all of these things have a scale, and there is a qualitative part to the interpretation. With donations, this is especially obvious - where it is very measurably true that some people are able to donate much more than others. However there is also an element of this with careers, where some people are able to have a huge impact with their careers, and others have smaller impact (yet still large in absolute terms). Because there are a lot of sensitive, qualitative judgement calls - we can't provide full reasoning transparency.

Hey Rob,
Thank you so much for your answer, it's really interesting to learn more about this. I understand that there are good reasons not to provide full reasoning transparency, but if these judgment calls are being made and underpin the work of CEA's groups team, that seems very relevant for the EA movement.
Do I interpret your comment correctly, that the CEA groups team does have an internal qualitative ranking, but you are not able to share it publicly? So different values could be assigned to the same resources, like a theoretical comparison of two people taking the same job for two different organisations?

If these judgment calls are being made and underpin the work of CEA’s groups team, that seems very relevant for the EA movement.

I agree. We're working on increasing transparency - expect to see more posts on this in the future.


Do I interpret your comment correctly, that the CEA groups team does have an internal qualitative ranking, but you are not able to share it publicly?

I'm not 100% clear on what you mean here, so I've taken a few guesses and answered all of them:

  • Do we have a qualitative ranking of the grants we've made: No. We are interested in making the "fund/don't fund" decision, so a qualitative ranking of everyone we've funded doesn't help us. We do have a list of the funding decisions we've made, and notes on the reasons why these decisions were made. These often involve qualitative judgements. We will sometimes look back at past decisions to help us calibrate.
  • Do we have a qualitative ranking of common actions community members might take: No. We don't have an X such that we could say "<job category> is worth X% of <job category>, holding 'ability' constant" for common EA jobs. Plausibly we should have something like this, but even this would need to be applied carefully - as different organisations are bottlenecked by different things.
  • Do we have heuristics that help us compare different community building outcomes: Yes. These differ between our programs, as it depends on how a program is attempting to help. E.g., in virtual programs admissions, we aren't able to assess applicants on outcomes, as for many participants it is one of their first interactions with EA. As I mentioned above, I want us to increase transparency on this.

I also want to emphasise that an important component in our grantmaking is creating healthy intellectual scenes.

Hey Rob,

I wonder if filling out something like the template I laid out in this post could allow transparency without disclosing confidential details for the CEA group's team. 

In addition, if I were getting career-related information from a community builder, that community builder's future career prospects depended on getting people like me to choose a specific career path, and that fact was neither disclosed nor reasonably implied, I would feel misled by omission (at best).

By analogy, let's say I went to a military recruiter and talked to them at length about opportunities in various branches of the military. Even though they identified themselves as a generic military recruiter, they secretly only got credit for promotion if I decided to join the Navy. I would feel entitled to proactive disclosure of that information, and would feel misled if I got a pro-Navy pitch without such disclosure. 

(I am not saying I would feel misled if the community builder were evaluated on getting people to make EA career choices more broadly. I think it's pretty obvious that recruiting is part of the mission and that community builders may be evaluated on that. Likewise, I wouldn't feel misled if the military recruiter didn't tell me they were evaluated on how many people they recruited for the military as a whole.)

In addition, if I were getting career-related information from a community builder, that community builder's future career prospects depended on getting people like me to choose a specific career path, and that fact was neither disclosed nor reasonably implied, I would feel misled by omission (at best).

As far as I know, this is exactly what is happening. 

This is an excellent example of a powerful advantage that EA/non-profit work has over non-EA/for-profit work: we all (in theory) have the same goal - to make the world a better place - so we can play a positive-sum game, leveraging collaboration and coordination and reducing duplicated work.

Zero- or negative-sum games are also, IMO, a broader societal problem that we need to put effort into fixing, so we should at least be cooperating within EA.

There is also something to be said for growing the pie, or the pie already being big enough to be shared by all; I'm definitely in the camp that EA should be a lot bigger and less elitist.

Joey - good post. Valid points.

As one example of an EA sub-community alienating another EA sub-community, I'm seeing a lot of recent posts about sexual misconduct within EA that tend to stereotype & demonize people who practice consensual non-monogamy (e.g. polyamory, open relationships) as if they're all sexual predators without any norms, boundaries, or ethics. As a sex researcher who studies anti-polyamory stigma, this seems fairly bad. Yet it's very hard to stand up for polyamory in these contexts, where any disagreement or pushback is perceived as invalidating someone's specific complaints about sexual misconduct. So, people into consensual non-monogamy may feel obligated to self-censor.

More generally, I'm seeing a lot of general misandry (anti-male bias) in some of these posts about sexual misconduct, as if all men are somehow complicit in the sexual misconduct of a few -- or, in extreme cases, as if all courtship and 'mating effort' by males is somehow ethically invalid. This risks alienating the silent majority of men in EA who don't like being demonized, but who are too frightened to protest against the casual misandry that has become all too common in online culture. 

I think it's important to remember that EA sub-communities include both female sub-communities and male sub-communities, and both monogamist and non-monogamist sub-communities, and we should try very hard to be equally welcoming, affirming, and validating to all of them.

Curious what downvoters are thinking. I disagree-voted, but upvoted because I thought the karma was too low.

I guess I try to reserve downvotes for comments that are either lazy or Overton-violating, and I think the mistakes Geoffrey is making in this post are within a reasonable error tolerance (i.e., neither of those).

I didn't vote in any way, but I do think Geoffrey's point is somewhat afield of what I perceive as the main point of Joey's post -- which, as I read it, is largely about communication with those who are outside or new to the community who may "encounter sub-communities before they have a full understanding of the movement and the wide variety of opinions and viewpoints within it." Likewise, the discussion of (e.g.,) males as a sub-community is somewhat distant from what I think Joey is trying to convey here.

Given the loose fit between main post and comment, people may be reacting to the past tendency for discussions on the issue of polyamory to significantly take over a comment thread. A downvote may be a method of communicating a desire to keep the discussion more tightly linked to the main point rather than turning Joey's post into another thread on polyamory, alleged misandry in internal community discussions, or similar issues.

Jason, thanks for your hypothesis. 

I am struck yet again by a double standard here, where if some groups or subcommunities complain that EA is not inclusive, or is too polarizing, their views are taken very seriously, and discussed respectfully. Whereas if other groups or subcommunities (e.g. men who don't like misandry, or poly people) complain that EA is not inclusive, their views are dismissed as irrelevant distractions. 

Yeah there's nonzero truth to this. An example is that Hanania's attempts at contributing to EA dialogue get downvoted to oblivion on here (though this example is flawed: Hanania has a mean-spirited and unpleasant writing style that would generate downvotes regardless of the value of his analysis). I think it's basically fine to conclude that the median request for epistemic diversity around here is a thinly veiled complaint about the space not being lefty enough. 

Which isn't intrinsically the same thing as the question of which group's grievances are honored and which groups are told to get over it, but definitely correlated. 

I disagree with Geoffrey because I think that on the forum (with a couple of exceptions), the "stereotype risk" or "risk from aggressive generalization", insofar as it may in theory negatively impact male readers, is well within a reasonable error tolerance. I've seen hardly anyone step over the line into what I'd call misandry, or even come close, really. (But that may just be because I'm toughened up from my years adjacent to outwardly pro-misandry parts of lefty cultures, lmao.)

Moreover, I expect OP would reasonably find relitigating epistemic diversity a little off topic. Kinda related, but a very different emphasis.

quinn - thanks for your reply. Valid points. 

I may be sensitized to the downsides of what one could call 'casual misandry' from my time hosting the Mating Grounds podcast (2014-2016), when we took calls from hundreds of young single men who felt unfairly attacked, demonized, & stereotyped by the current mainstream culture. So when I see discussions of sexual (mis)conduct issues in EA Forum, I may be perceiving more casual misandry and anti-male stereotyping than others might.

Well, I can't even see my own comment any more, and I have no idea why people are downvoting my plea for EAs to stop demonizing polyamorous people and men. Puzzling.

An anecdote I sometimes share: during my undergraduate college search, I experienced what you would call "polarizing techniques" at one university and their antithesis at another. I had previously attended a summer camp at a university in my home state; in my senior year of high school, they invited me back for a short seminar and proceeded to spend an hour talking about how wonderful they are, how privileged I would be to attend, how much of an honor it was to be invited to join [insert pithy university collective name]. They were, in fact, a decent school. They were also my backup option. Big fish, small pond. 

I attended a different university's program not long after. The program director's welcome speech, by contrast, said in essence "we want you here, we think we'd be good for you, but you should go to the school that will bring out the best in you; if you think that's not us, go elsewhere with our blessing." 

I attended the second school and never regretted it. While my decision was pretty overdetermined, the stark contrast between the pushy, snobbish diatribe at the first school and the encouraging, welcoming, confident-but-not-arrogant tone at the second was a definite influence. 

Respect your audience, and they respect you back. 
