
Is EA a cult?

Cultishness is a spectrum. As I’ll demonstrate, EA has some characteristics normally associated with cults, which can give both outsiders and people already engaged with EA a very bad impression. I’m not arguing that the situation is so bad that EA is a cult in the extreme sense of the word. But people who don’t know much about EA and come away with the impression that it’s a cult are not crazy to think so.

A narrative of EA as a cult

What if I told you about a group of people calling themselves Effective Altruists (EAs) who wish to spread their movement to every corner of the world? Members of this movement spend a lot of money and energy on converting more people to their ideology (they call it community building). They tend to target young people, especially university students (and increasingly high school students). They give away free books written by their gurus and run fellowship programs. And they will gladly cover all expenses for newcomers to travel and attend their events, so that newcomers can be exposed to their ideas.

EAs are encouraged to consult the doctrine and other movement members about most major life decisions, including what to do for a living (career consultation), how to spend their money (donation advice), and what to do with their spare time (volunteering). After joining the movement, EAs are encouraged to give away 10% or more of their income to support the movement and its projects. It’s not uncommon for EAs to socialize and live mostly with other EAs (some will get subsidized EA housing). Some EAs want even their romantic partners to be members of the community. 

While they tend to dismiss what’s considered common sense by normal mainstream society, EAs will easily embrace very weird-sounding ideas once endorsed by the movement and its leaders.

Many EAs believe that the world as we know it may soon come to an end and that humanity is under existential threat. They believe that most normal people are totally blind to the danger. EAs, on the other hand, have a special role in preventing the apocalypse, and only through incredible efforts can the world be saved. Many of these EAs describe their aspirations in utopian terms, declaring that an ideal world free of aging and death is waiting for us if we take the right actions. To save the world and navigate the future of humanity, EAs often talk about the need to influence governments and public opinion to match their beliefs.

It’s not just optics

While I’ve focused on how EA might be perceived from the outside, I think that many of the cult-like features of EA pose a real issue, so it’s not just a PR problem. I don’t yet have a good mental model of all the ways in which this plays out, but I believe there are other negative consequences beyond appearances. The impression of a cult could also explain why some of the recent media attention on EA hasn’t been very positive (though I’m not saying it’s the only reason).

How to make EA less cultish

Many of the features of EA that make it look and sound like a cult (e.g. the pursuit of growth, the willingness to accept unconventional ideas) are quite essential to the project of doing the most good, so I’m not suggesting we automatically get rid of everything that could possibly be perceived as cultish (on the other hand, how EA is perceived matters quite a lot, so we shouldn’t ignore these considerations either). Having said that, I believe there are cultural norms we could embrace that would push us away from cultishness without significantly compromising other goals.

Some helpful anti-cultishness norms that are already established in EA to some extent, and that I’d like to see further cultivated, include:

Other norms that I’d like to see include:

  • Advertising EA too aggressively can be counterproductive (even though we want EA to grow).
  • Having one’s whole life revolve around EA should be considered unhealthy.
  • Being known as the person who only talks about EA might be a red flag.
  • Going against social norms is not a virtue; it’s a price we sometimes have to pay for doing the right thing.
  • Moderation is a virtue.
  • Mixing work interactions and intimate relationships (such as sex or residence) shouldn't be taken lightly.
  • Conflicts of interests should be taken seriously.
  • It’s important to seriously consider what non-EA people and organizations have to say, even when they don’t think and communicate in EA’s preferred style (which can make it tempting to dismiss them as having bad epistemics).

Others have also written about different aspects of the problem and potential solutions (see for example 1, 2, 3, 4).

Summary

I don’t think EA is a cult in the worst sense of the word, but it does have many cult-like features that easily give a bad impression (mainly, though not only, to outsiders). There are cultural norms and attitudes that, if cultivated, could make EA less cultish.

Acknowledgements

I want to thank Edo Arad, Gidon Kadosh, and Sella Nevo for their feedback. Insofar as this post still sucks, it’s entirely my fault.

Comments (19)



As a newcomer to EA, and a person with a fair amount of experience of cults and cult-like groups (and I'm 78 years old), I would like to report my experience.

I am very attracted to the ideas expressed in Doing Good Better. Having a science background, the idea of analyzing how effective my philanthropy may be is something I have pursued for many years, leading to many of the same conclusions.

On the other hand, many of the ideas of longtermism, trying to save human beings thousands of years in the future, being concerned about spreading to other planets, seeing malevolent AGI as among the most critical issues to address, strike me as similar to cults like Scientology and other groups whose vision and ideas seem contrary to common sense (if not downright wacky) but which seem to be common currency, if not required, among "members."

In What We Owe the Future, MacAskill often expresses reservations about his ideas, points out alternatives or potential flaws, and in general shows somewhat more humility than I encounter on this Forum, for example. I certainly disagree with some of his conclusions and approaches, which I have begun to attempt to express in my few posts here to date, but I do respect his and others' efforts to think long-term when accompanied by express recognition of our limitations in trying to impact the future (except in set-in-stone certainties) more than a couple of decades out. Without those ongoing acknowledgments of our limitations, our uncertainty, and the weirdness of our perspectives (from a "normal" viewpoint), we are bound to come across as potentially cult-like.

pete

It’s also important to note that with many EAs (I think 86% at the last survey) being nonreligious, it can be relatively easy for EA to play the role of a religion in people’s lives. The cultural template is there, and it’s powerful.

Great point. The decline of religion has arguably left a cultural vacuum that new organizations can fill.

TobyW

Needed to be said. I'm someone who gravitates to a lot of EA ideas, but I've avoided identifying as "an EA" for just this reason. Recently went to an EAG, which quelled some of my discomfort with EA's cultishness, but I think there's major room for improvement.

My lightly held hypothesis is that the biggest reason for this is EA's insularity. I think that using broader means of communication (publishing in journals and magazines, rather than just the EA Forum) would go a really long way to enabling people to be merely inspired by EA, rather than "EAs" themselves. I like EA as a set of ideas and a question, not so much as a lifestyle and an all-consuming community. People should be able to publicly engage with (and cite!) EA rhetoric without having to hang out on a particular forum or have read the EA canon.

Interestingly, we both posted on the same day, but I take almost the opposite approach! Would appreciate your thoughts!

https://forum.effectivealtruism.org/posts/3Jm6tK3cfMyaan5Dn/ea-is-not-religious-enough-ea-should-emulate-peak-quakerism

Yes, the timing was funny (it looks like virtually everyone on the forum has something to say about EA culture nowadays :P). I commented on your post.

I'd find this post much more valuable if it argued that some parts of the EA community were bad, rather than arguing that they're cultish. Cultish is an imperfect proxy for badness. Sure, cults are bad and something being a thing which cults do is weak evidence of its badness (see Reversed Stupidity Is Not Intelligence). Is, say, advertising EA too aggressively bad? Probably! But it is bad for specific reasons, not because it is also a thing cults do.

A particular way cultishness could be bad, which would make it directly bad for EA to be cultish, is if cults are an attractor in the space of organizations. This would mean that organizations with some properties of cults would feel pressure to gain more and more properties of cults. Still, I currently don't think this is the case, and so I think direct criticisms are much more valuable than insinuations of cultishness.

I think a great way to prevent us from turning into a cult is listening to criticism and having a diversity of opinions.

I would say EAs for the most part are open to criticism but there are EAs who will unintentionally partake in fallacious reasoning when their EA cause or EA opinion is attacked.

MacAskill once said EAs for the most part are social liberals. This is understandable considering most EAs come from fortunate backgrounds and have a college education.

Matt Yglesias and Tyler Cowen noted in their fireside chats at EA Global that there is a sense of homogeneity and a common group. Tyler Cowen said most EAs are “coastal elites.” I wouldn’t use the term coastal, but most EAs definitely come from elite families, elite colleges, elite companies, or some other form of elite group.

There’s nothing wrong with being a social liberal (I am one), college educated, wealthy, or elite, but it could create an echo chamber and result in a cult or something cult-like.

I would like to see more libertarians, conservatives, and non-elites in EA so we can get different viewpoints, critiques, and a diversity of thought.

I think that the "A narrative of EA as a cult" section is helpful for steelmanning this narrative/perception. I also appreciate your suggestions and ideas in the "How to make EA less cultish" section.

As far as I can see, you don't explore any substantive reasons or evidence why "the cult-like features of EA pose a real issue" beyond optics; you note that "the impression of a cult could also explain why some of the recent media attention on EA hasn’t been very positive", but this is about optics. So I'd be interested to hear/read you try to flesh out the "other negative consequences" that you allude to. 

The easier option is to remove "(and it's not just optics)" from the title and rename "It’s not just optics" to "It might be more than optics" or similar, but if you do have thoughts on the other negative consequences, these could be valuable to share.

(For context, I'm one of the people involved in reaching out to high school students, and I'm keen to understand the full implications -- pros and cons -- of doing so, and if there's anything we can do to mitigate the downsides while retaining the benefits.)

Thanks for this helpful post. Strong upvoted.

Thank you for this comment!

You are absolutely right. I didn't really explore any consequences of EA being cultish other than optics. As I said in the post, I don't really have a good mental model of all the ways in which it plays out, but I do have a strong intuitive sense that it does have other bad consequences (honestly, this entire post is based on intuitions and anecdotal evidence - none of my claims are based on rigorous studies).

Having said that, here's a very partial list of other consequences that I believe exist:

1. Making people with different levels of engagement with EA feel uncomfortable (you could say it's also just optics, but I think they have good reasons to feel uncomfortable).

2. Bad epistemics, groupthink and echo chamber effects (I develop this idea a bit further here).

3. Not engaging enough with people and opinions outside EA.

4. Attracting mostly very specific types of people (again, maybe this could be labeled as optics).

5. Radical beliefs.

And just to clarify (I know that you know that, but just for the record) - I'm not saying that outreach to children is necessarily a bad idea. It has many pros and cons that should be weighed somehow. I hope that my post has been helpful in describing some potential risks. 

Something I think is really valuable is being upfront about mistakes and uncertainties.  I really admire CEA's mistakes page, for example. Cults often try to portray themselves and their leaders as infallible. Whereas admitting mistakes helps dispel that illusion and encourage critical thinking and a diversity of ideas. 

I'm curious whether the reason why EA may be perceived as a cult while, e.g., environmentalist and social justice activism are not, is primarily that the concerns of EA are much less mainstream.

I appreciate the suggestions on how to make EA less cultish, and I think they are valuable to implement, but I don't think they would have a significant effect on public perception of whether EA is a cult.

Interesting post, and some valid points.

I would also add: cults tend to micro-manage the sexual relationships and reproductive strategies of their members. 

Sometimes this involves minimizing sexual activity, so cult members direct all of their energy and time into cult-propagation rather than mating effort. Sometimes it involves maximizing sexual connections or endogamous marriages within the cult, so people don't feel any tension between their relationship commitments and their cult commitments.

Sometimes cults are anti-natalist and strongly discourage reproduction in order to maximize energy and time directed into cultural cult-propagation (i.e. 'horizontal cultural transmission'). Sometimes they're pro-natalist and strongly encourage reproduction in order to create new recruits for the next generation (i.e. 'vertical cultural transmission'). 

An implication is that the more 'normal' EA seems in terms of relationship formation (e.g. a nice mix of 'cultural inbreeding' within the group and outbreeding outside the group), and family formation (e.g. people having kids, but not crazy numbers of kids), the less cult-like we'll seem.

Maybe hiring non-EAs for certain roles (like "communications assistant", not "board member") could improve communications, appearances, and maybe outreach?

I am not concerned too much with EA turning into a cult, for one reason:

Cults/New Religious Movements are vastly less bad than most people think, and the literature on cults repudiates a lot of claims that the general population believes about cults, especially anything to do with harm.

Link to it here:

https://www.lesswrong.com/posts/TiG8cLkBRW4QgsfrR/notes-on-brainwashing-and-cults

pete

Replying to this because I don't think this is a rare view, and I'm concerned about it. Met someone this week who  seemed to openly view cults as a template (flawed, but useful) and was in the process of building a large compound overseas where he could disseminate his beliefs to followers who lived onsite. By his own admission, he was using EA as a platform to launch multiple(?) new genuine religions. 

In light of the Leverage Research incident, we should expect and keep an eye out for folks using the EA umbrella to actually start cults.

My point is that, contra the narrative in this post, cults are vastly less bad than the general public believes, so much so that the post is responding to a straw problem. I don't necessarily agree with the beliefs of the New Religious Movements/cults, but the cult literature shows that they are vastly less bad than the general public thinks.

I know it's a counterintuitive truth, but I want people to understand that the general public believing something is bad does not equal badness.

[anonymous]

I skimmed the link and it seems to be mostly about brainwashing not being effective. But cults do a lot of damage besides brainwashing. The insight that cults do provide some value to their members (otherwise why would anyone join?) is true, but does not mean that they don't do a lot of net harm. 
