Last week, someone posted this comment on the Effective Altruism USC Instagram (we deleted it):

This question isn’t out of the ordinary for us at USC, and I’m sure other EA groups get asked the same thing.

Our Instagram account might not be the most polished or sleek, but in our opinion it certainly doesn’t give off cult-like vibes. Other elements of EA might understandably be seen as cult-like (e.g. ubiquitous “retreats,” mysterious wealthy donors funding our dinners, debates about which year AI will kill everyone), but not our Instagram, nor any other EA uni group’s Instagram that I’ve seen.

The comment we received was on a post announcing an AI governance speaker event, and (to my knowledge) the commenter has not yet interacted with anyone on our leadership team–so it’s possible that this cult impression came solely from seeing our (relatively normal) online presence. 

So what gives? 

Here’s a hypothesis that could explain some cases of mistaken impressions (although obviously not all): when people look up “altruism,” they find potentially culty-looking things.

There’s already been plenty of discussion that the word “altruism” itself reminds people of other dogmatic “isms” (Positive Impact Society Erasmus wrote a post mentioning it, with a recent update). From personal experience, a lot of people don’t know what “altruism” means–so maybe they go to look it up on Google. And what do they find?

Even though this description is factually correct, it prominently mentions spirituality, tradition, and religion, which could plausibly seem cult-like for a movement that advertises itself as “doing good using evidence and reasoning.” People looking for a quick definition might briefly see those words and assume EA is some strange system of worship. The 18th-century oil painting probably doesn’t help.

The second Google result I get is this: 

For someone who’s just been primed with the first definition, this might only raise more red flags. These people are all about the “moral practice” of a “traditional virtue,” and they want us to “promote others’ welfare” at our own cost?

I’m aware that this is an uncharitable reading of the Google search results. I’m trying to trace out how someone might go from merely seeing the words “effective altruism,” without interacting with us or knowing much about the movement, to thinking we’re a cult. If anyone has another plausible story for how this might happen, I’d love to hear it.

Some things we can do about it

  1. Give feedback on Google’s knowledge panel (the rectangular box on the right when you look something up), so that it’s more about helping others and less about religion/spirituality. I’m not sure what process Google uses to review feedback requests, so I don’t know if they’d be more likely to consider changes if multiple people submit the same comment–but it seems worth trying anyway. To give feedback, click the “Feedback” link at the bottom of the knowledge panel.
  2. Edit the Wikipedia page for altruism again. There’s already content about EA there, but the initial section of the article could be tweaked. It could also help to replace the main image with something else that seems more contemporary. (I haven’t done this yet, but I might try to).
  3. Explicitly define “altruism” when giving EA intro pitches, whether verbally or in writing. Most EA intro articles that I’m aware of (like this one) don’t do this–and people who aren’t familiar with the word might then Google it, leading to the cult impression. For the future, it’s probably low-cost to add a quick sentence giving a definition of altruism to any intro EA content, to clarify any potential misinterpretations. 
  4. Ask why people outside the EA community associate us with cultiness, to see whether this hypothesis even has merit. I’m not sure if any focus group-style studies have been done on the EA community’s image before, but maybe that’s worthwhile. (And if anyone knows of any, I’d be curious to read them).


Comments

One theory that I'm fond of, both because it has some explanatory power and because, unlike other theories with explanatory power, it is useful to keep in mind and not based as directly on misconceptions, goes like this:

1. A social group that has a high cost of exit can afford to raise the cost of staying. That is, if it would be very bad for you to leave a group you are part of, the group can more successfully pressure you to be more conformist, work harder in its service, and tolerate weird hierarchies.

2. What distinguishes a cult, or at least one of the most important things that distinguishes it, is that it is a social group that deliberately raises the cost of leaving in order to also raise the cost of staying. For instance, it relocates people, makes them cut off other relationships, etc.

3. Effective Altruism does not deliberately raise the cost of leaving for this purpose, and I haven't seen it really raise the cost of staying either. Even more than in most social groups I have been part of, being critical of the movement, having ideas that run counter to central dogmas, and being heavily involved in competing social groups are all tolerated or even encouraged. However,

4. The cost of leaving is high for many Effective Altruists, and much of this is self-inflicted. Effective Altruists like to live with other Effective Altruists, make mostly Effective Altruist close friends, enter romantic relationships with other Effective Altruists, work at Effective Altruist organizations, and believe idiosyncratic ideas mostly found within Effective Altruism. Some of this comes from a desire to do good; speaking from experience, much of it is because we are weirdos who are most comfortable hanging out with people who are similar types of weirdos to us, and who have a hard time with social interactions in general. Therefore,

5. People looking in sometimes see the things from point four, the things that contribute to the high cost of leaving, and even if they can't put what's cultish about it into words, they worry about possible cultishness, and they don't know the stuff in point three viscerally enough to be dissuaded of this impression. Furthermore, even if EA isn't a cult, point four is still important, because it increases the risk of cultishness creeping up on us.

Overall, I'm not sure what to do with this. I guess be especially vigilant, and maybe work a little harder to have as much of a life as possible outside of Effective Altruism. Anyway, that's my take.

I've had quite a few people ask me "What's altruism?" when running university clubs fair stalls for EA Wellington.

Yeah, I've had several (non-exchange) students ask me what altruism means--my go-to answer is "selflessly helping others," which I hope makes it clear that it describes a practice rather than a dogma. 

We had that as well at EA USyd, but they were all security guards etc. working on campus, or some exchange students.

aogara

Hey Aman, thanks for the post. It does seem a bit outdated that the top picture for altruism is a French painting from hundreds of years ago. EA should hope to change the cultural understanding of doing good from something that's primarily religious or spiritual, to something that can be much more scientific and well-informed.

I do think some of the accusations of EA being a cult might go a bit deeper. There aren't many other college clubs that would ask you to donate 10% of your income or determine your career plans based on their principles. One community builder who'd heard similar accusations here traced the concerns to EA's rapid growth in popularity and a certain "all-or-nothing" attitude in membership. Here's another person who had some great recommendations for avoiding the accusation. I particularly liked the emphasis on giving object-level arguments rather than appealing to authority figures within EA. 

Overall, it seems tough for an ethical framework + social movement to avoid the accusation at times, but hopefully our outreach can be high quality enough to encourage a better perception. 

Thanks for the comment! I agree with your points--there are definitely elements of EA, whether they're core to EA or just cultural norms within the community, that bear a stronger resemblance to cult characteristics.

My main point in this post was to explore why someone who hasn't interacted with EA before (and might not be aware of most of the things you mentioned) might still get a cult impression. I didn't mean to claim that the Google search results for "altruism" are the most common reason why people come away with a cult impression. Rather, I think that they might explain a few perplexing cases of cult impressions that occur before people become more familiar with EA. I should have made this distinction clearer, thanks for pointing it out :)

A key characteristic of a cult is a single leader who accrues a large amount of trust and is held by themselves and others to be singularly insightful. The LW space gets like that sometimes, less so EA, but they are adjacent communities. 

Recently, Eliezer wrote

The ability to do new basic work noticing and fixing those flaws is the same ability as the ability to write this document before I published it, which nobody apparently did, despite my having had other things to do than write this up for the last five years or so.  Some of that silence may, possibly, optimistically, be due to nobody else in this field having the ability to write things comprehensibly - such that somebody out there had the knowledge to write all of this themselves, if they could only have written it up, but they couldn't write, so didn't try.  I'm not particularly hopeful of this turning out to be true in real life, but I suppose it's one possible place for a "positive model violation" (miracle).  The fact that, twenty-one years into my entering this death game, seven years into other EAs noticing the death game, and two years into even normies starting to notice the death game, it is still Eliezer Yudkowsky writing up this list, says that humanity still has only one gamepiece that can do that.  I knew I did not actually have the physical stamina to be a star researcher, I tried really really hard to replace myself before my health deteriorated further, and yet here I am writing this.  That's not what surviving worlds look like.

I don't necessarily disagree with this analysis; in fact, I have made similar observations myself. But the social dynamic of it all pattern-matches to cult-like behavior, and I think that's a warning sign we should be wary of as we move forward. In fact, I think we should probably have an ongoing community health initiative targeted specifically at monitoring for signs of groupthink and other forms of epistemic failure in the movement.

[anonymous]

The real issue is that AI tends to be both the public-facing side of EA and the area with a lot of existential claims that sound cultish, like "If AGI happens, we'll go extinct." We really need new EAs to focus on specific cause areas, to make EA less of a personal identity.