
Is EA a cult?

Cultishness is a spectrum. As I'll demonstrate, EA has some characteristics normally associated with cults, which can give both outsiders and those already engaged with EA a very bad impression. I'm not arguing that the situation is so bad that EA is a cult in the extreme sense of the word, but I do think that people who don't know much about EA and come away with the impression that it's a cult are not crazy to think so.

A narrative of EA as a cult

What if I told you about a group of people calling themselves Effective Altruists (EAs) who wish to spread their movement to every corner of the world? Members of this movement spend a lot of money and energy on converting more people to their ideology (they call it community building). They tend to target young people, especially university students (and increasingly high school students). They give away free books written by their gurus and run fellowship programs. And they will gladly pay all the expenses for newcomers to travel to and attend their events, so that they can expose them to their ideas.

EAs are encouraged to consult the doctrine and other movement members about most major life decisions, including what to do for a living (career consultation), how to spend their money (donation advice), and what to do with their spare time (volunteering). After joining the movement, EAs are encouraged to give away 10% or more of their income to support the movement and its projects. It's not uncommon for EAs to socialize and live mostly with other EAs (some will get subsidized EA housing). Some EAs even want their romantic partners to be members of the community.

While they tend to dismiss what's considered common sense by mainstream society, EAs will readily embrace very weird-sounding ideas once these are endorsed by the movement and its leaders.

Many EAs believe that the world as we know it may soon come to an end and that humanity is under existential threat. They believe that most normal people are totally blind to the danger. EAs, on the other hand, have a special role in preventing the apocalypse, and only through incredible efforts can the world be saved. Many of these EAs describe their aspirations in utopian terms, declaring that an ideal world free of aging and death is waiting for us if we take the right actions. To save the world and navigate the future of humanity, EAs often talk about the need to influence governments and public opinion to match their beliefs.

It’s not just optics

While I've focused on how EA might be perceived from the outside, I think that many of the cult-like features of EA pose a real issue, so it's not just a PR problem. I don't yet have a good mental model of all the ways in which this plays out, but I believe there are other negative consequences. The impression of a cult could also explain why some of the recent media attention on EA hasn't been very positive (I'm not saying it's the only reason).

How to make EA less cultish

Many of the features of EA that make it look and sound like a cult (e.g. pursuit of growth, willingness to accept unconventional ideas) are quite essential to the project of doing the most good, so I'm not suggesting we automatically get rid of everything that could possibly be perceived as cultish (on the other hand, I think that how EA is perceived is quite important, so we shouldn't ignore these considerations either). Having said that, I believe there are cultural norms we could embrace that would push us away from cultishness without significantly compromising other goals.

Some helpful anti-cultishness norms that are already established in EA to some extent, and that I'd like to see further cultivated, include:

Other norms that I’d like to see include:

  • Advertising EA too aggressively can be counterproductive (even though we want EA to grow).
  • Having one's whole life revolve around EA should be considered unhealthy.
  • Being known as the person who only talks about EA might be a red flag.
  • Going against social norms is not a virtue; it’s a price we sometimes have to pay for doing the right thing.
  • Moderation is a virtue.
  • Mixing work interactions and intimate relationships (such as sex or residence) shouldn't be taken lightly.
  • Conflicts of interests should be taken seriously.
  • It's important to seriously consider what non-EA people and organizations have to say, even when they don't think and communicate in EA's preferred style (which can make it tempting to dismiss them as having bad epistemics).

Others have also written about different aspects of the problem and potential solutions (see for example 1, 2, 3, 4).

Summary

I don't think that EA is a cult in the worst sense of the word, but it seems to have many cult-like features that easily give a bad impression (mainly, but not only, to outsiders). There are cultural norms and attitudes that, if cultivated, could make EA less cultish.

Acknowledgements

I want to thank Edo Arad, Gidon Kadosh, and Sella Nevo for their feedback. Insofar as this post still sucks, it's entirely my fault.

Comments

As a newcomer to EA, and a person with a fair amount of experience of cults and cult-like groups (and I'm 78 years old), I would like to report my experience.

I am very attracted to the ideas expressed in Doing Good Better. Having a science background, the idea of analyzing how effective my philanthropy may be is something I have pursued for many years, leading to many of the same conclusions.

On the other hand, many of the ideas of longtermism, trying to save human beings thousands of years in the future, being concerned about spreading to other planets, seeing malevolent AGI as among the most critical issues to address, strike me as similar to those of cults like Scientology and other groups whose vision and ideas seem contrary to common sense (if not downright wacky) but which seem to be common currency, if not required, among "members."

In What We Owe the Future, MacAskill often expresses reservations about his ideas, points out alternatives or potential flaws, and in general shows somewhat more humility than I encounter on this Forum, for example. I certainly disagree with some of his conclusions and approaches, which I have begun to attempt to express in my few posts here to date, but I do respect his and others' efforts to think long-term when accompanied by express recognition of our limitations in trying to impact the future (except in set-in-stone certainties) more than a couple of decades out. Without those ongoing acknowledgments of our limitations, our uncertainty, and the weirdness of our perspectives (from a "normal" viewpoint), we are bound to come across as potentially cult-like.

pete:

It’s also important to note that with many (I think 86% at last survey) EAs being nonreligious, it can be relatively easy for EA to play that role in people’s lives. The cultural template is there, and it’s powerful.

Great point. The decline of religion has arguably left a cultural vacuum that new organizations can fill.

TobyW:

Needed to be said. I'm someone who gravitates to a lot of EA ideas, but I've avoided identifying as "an EA" for just this reason. Recently went to an EAG, which quelled some of my discomfort with EA's cultishness, but I think there's major room for improvement.

My lightly held hypothesis is that the biggest reason for this is EA's insularity. I think that using broader means of communication (publishing in journals and magazines, rather than just the EA Forum) would go a long way toward enabling people to be merely inspired by EA, rather than "EAs" themselves. I like EA as a set of ideas and a question, not so much as a lifestyle and an all-consuming community. People should be able to publicly engage with (and cite!) EA rhetoric without having to hang out on a particular forum or have read the EA canon.

Interestingly, we both posted on the same day, but I take almost the opposite approach! Would appreciate your thoughts!

https://forum.effectivealtruism.org/posts/3Jm6tK3cfMyaan5Dn/ea-is-not-religious-enough-ea-should-emulate-peak-quakerism

Yes, the timing was funny (it looks like virtually everyone on the forum has something to say about EA culture nowadays :P)
I commented on your post.

I'd find this post much more valuable if it argued that some parts of the EA community were bad, rather than arguing that they're cultish. Cultish is an imperfect proxy for badness. Sure, cults are bad and something being a thing which cults do is weak evidence of its badness (see Reversed Stupidity Is Not Intelligence). Is, say, advertising EA too aggressively bad? Probably! But it is bad for specific reasons, not because it is also a thing cults do.

A particular way cultishness could be bad, which would make it directly bad for EA to be cultish, is if cults are an attractor in the space of organizations. This would mean that organizations with some properties of cults would feel pressure to gain more and more properties of cults. Still, I currently don't think this is the case, and so I think direct criticisms are much more valuable than insinuations of cultishness.

I think a great way to prevent us from turning into a cult is listening to criticism and having a diversity of opinions.

I would say EAs for the most part are open to criticism, but there are EAs who will unintentionally engage in fallacious reasoning when their EA cause or opinion is attacked.

MacAskill once said EAs for the most part are social liberals. This is understandable considering most EAs come from fortunate backgrounds and have a college education.

Matt Yglesias and Tyler Cowen noted in their fireside chats at EA Global that there is a sense of homogeneity and a common group. Tyler Cowen said most EAs are "coastal elites." I wouldn't use the term coastal, but most EAs definitely come from elite families, elite colleges, elite companies, or some other form of elite group.

There’s nothing wrong with being a social liberal (I am one), college educated, wealthy, or elite, but it could create an echo chamber and result in a cult or something cult-like.

I would like to see more libertarians, conservatives, and non-elites in EA so we can get different viewpoints, critiques, and a diversity of thought.

I think that the "A narrative of EA as a cult" section is helpful for steelmanning this narrative/perception. I also appreciate your suggestions and ideas in the "How to make EA less cultish" section.

As far as I can see, you don't explore any substantive reasons or evidence why "the cult-like features of EA pose a real issue" beyond optics; you note that "the impression of a cult could also explain why some of the recent media attention on EA hasn’t been very positive", but this is about optics. So I'd be interested to hear/read you try to flesh out the "other negative consequences" that you allude to. 

The easier option is to remove "(and it's not just optics)" from the title and rename "It’s not just optics" to "It might be more than optics" or similar, but if you do have thoughts on the other negative consequences, these could be valuable to share.

(For context, I'm one of the people involved in reaching out to high school students, and I'm keen to understand the full implications -- pros and cons -- of doing so, and if there's anything we can do to mitigate the downsides while retaining the benefits.)

Thanks for this helpful post. Strong upvoted.

Thank you for this comment!

You are absolutely right. I didn't really explore any consequences of EA being cultish other than optics. As I said in the post, I don't really have a good mental model of all the ways in which it plays out, but I do have a strong intuitive sense that it does have other bad consequences (honestly, this entire post is based on intuitions and anecdotal evidence - none of my claims are based on rigorous studies).

Having said that, here's a very partial list of other consequences that I believe exist:

1. Making people with different levels of engagement with EA feel uncomfortable (you could say it's also just optics, but I think they have good reasons to feel uncomfortable).

2. Bad epistemics, groupthink and echo chamber effects (I develop this idea a bit further here).

3. Not engaging enough with people and opinions outside EA.

4. Attracting mostly very specific types of people (again, maybe this could be labeled as optics).

5. Radical beliefs.

And just to clarify (I know that you know that, but just for the record) - I'm not saying that outreach to children is necessarily a bad idea. It has many pros and cons that should be weighed somehow. I hope that my post has been helpful in describing some potential risks. 

Something I think is really valuable is being upfront about mistakes and uncertainties.  I really admire CEA's mistakes page, for example. Cults often try to portray themselves and their leaders as infallible. Whereas admitting mistakes helps dispel that illusion and encourage critical thinking and a diversity of ideas. 

I'm curious whether the reason why EA may be perceived as a cult while, e.g., environmentalist and social justice activism are not, is primarily that the concerns of EA are much less mainstream.

I appreciate the suggestions on how to make EA less cultish, and I think they are valuable to implement, but I don't think they would have a significant effect on public perception of whether EA is a cult.

Interesting post, and some valid points.

I would also add: cults tend to micro-manage the sexual relationships and reproductive strategies of their members. 

Sometimes this involves minimizing sexual activity, so cult members direct all of their energy and time into cult-propagation rather than mating effort. Sometimes it involves maximizing sexual connections or endogamous marriages within the cult, so people don't feel any tension between their relationship commitments and their cult commitments.

Sometimes cults are anti-natalist and strongly discourage reproduction in order to maximize energy and time directed into cultural cult-propagation (i.e. 'horizontal cultural transmission'). Sometimes they're pro-natalist and strongly encourage reproduction in order to create new recruits for the next generation (i.e. 'vertical cultural transmission'). 

An implication is that the more 'normal' EA seems in terms of relationship formation (e.g. a nice mix of 'cultural inbreeding' within the group and outbreeding outside the group), and family formation (e.g. people having kids, but not crazy numbers of kids), the less cult-like we'll seem.

Maybe hiring non-EAs for certain roles (like "communications assistant" and not like "board member") could improve communications/appearances/maybe outreach?

I am not concerned too much with EA turning into a cult, for one reason:

Cults/New Religious Movements are vastly less bad than most people think, and the literature on cults repudiates a lot of claims that the general population believes about cults, especially anything to do with harm.

Link to it here:

https://www.lesswrong.com/posts/TiG8cLkBRW4QgsfrR/notes-on-brainwashing-and-cults

pete:

Replying to this because I don't think this is a rare view, and I'm concerned about it. Met someone this week who seemed to openly view cults as a template (flawed, but useful) and was in the process of building a large compound overseas where he could disseminate his beliefs to followers who lived onsite. By his own admission, he was using EA as a platform to launch multiple(?) new genuine religions.

In light of the Leverage Research incident, we should expect and keep an eye out for folks using the EA umbrella to actually start cults.

My point is that, contra the narrative in this post, cults are vastly less bad than the general public believes, so much so that the post is responding to a straw problem. I don't necessarily agree with the beliefs of New Religious Movements/cults, but the cult literature shows that they are vastly less bad than the general public thinks.

I know it's a counterintuitive truth, but I want people to understand that the general public believing something is bad does not equal badness.

[anonymous]:

I skimmed the link and it seems to be mostly about brainwashing not being effective. But cults do a lot of damage besides brainwashing. The insight that cults do provide some value to their members (otherwise why would anyone join?) is true, but does not mean that they don't do a lot of net harm. 
