People doing meta-EA sometimes jokingly frame their work as “I want to indoctrinate people into EA, sort of like what cults do, but don’t worry! Haha! What we do is fine because EA isn’t a cult.”

I think this is a harmful mindset. When thinking about how to get more people to be EA-aligned, I think skill mastery is a better model than cult indoctrination. 

Skill mastery often has these attributes: 

  • Repeated deliberate practice.
  • “Thinking about it in the shower”, i.e. mulling it over in idle moments, without deliberate effort.
  • “Gears-level understanding”, i.e. knowing the foundations of the skill and understanding how all the pieces relate.

Cult indoctrination is more about gaslighting people into believing that there can be no other truth than X. Cults do this by repeating talking points and creating closed social circles. 

Accordingly, when thinking about how to get more people to be EA-aligned, here are some good questions to ask: 

  • Can we build structures which enable repeated deliberate practice of EA? Good examples are intro fellowships, book recommendations, and club meetings. Are there more?
  • Can we get people to “think about EA in the shower”? One way to encourage this could be to provide better-written reading materials that pose questions amenable to shower thoughts. 
  • Can we encourage more “gears-level understanding” of EA concepts? For example, emphasize the reasons behind x-risk calculations rather than their conclusions. 

It is also probably a bad idea for EA to resemble a cult, because cults have bad epistemics. Accordingly, here are some paths to avoid going down: 

  • Repeating talking points: when discussing EA topics with a skeptical non-EA, don’t repeat standard EA talking points if they’re not resonating. It is useless to say “AI Safety is a pressing problem because superintelligent AGI may pose existential risk” to someone who does not believe superintelligent AGI could ever exist. Instead, you can have a more intellectually honest conversation by first understanding their current worldview and model of AI, and building from there. In other words, it is important to adopt good pedagogy: building from the student's foundations rather than instructing them to memorize isolated facts.
  • Closed social circles: for example, in the setting of a university group, it is probably a bad idea to create an atmosphere where people new to EA feel out of place. 

The central idea here is that promoting gears-level understanding of EA concepts is important. Gears-level understanding often has repeated deliberate practice and shower thoughts as prerequisites, so skill mastery and gears-level understanding are closely related goals.

I would rather live in a world of people who have their own sound models of x-risk and other pressing problems, even if those models substantially differ from the standard EA viewpoint, than a world of people who are fully on board with the standard EA viewpoint but don’t have complete mastery of the ideas behind it.

Summary: People who try to get more people to be EA-aligned often use techniques associated with cult indoctrination, such as repeating talking points and creating closed social circles. Instead, I think it is more useful to think about EA-alignment as a skill that a person can master. Accordingly, good techniques to employ are repeated deliberate practice, "thinking about it in the shower", and promoting gears-level understanding. 

Comments
Mau

Thanks! Seems like a useful perspective. I'll pick on the one bit I found unintuitive:

Summary: People who try to get more people to be EA-aligned often use techniques associated with cult indoctrination, such as repeating talking points and creating closed social circles.

In the spirit of not repeating talking points, could you back up this claim, if you meant it literally? This would be big if true, so I want to flag that:

  • You state this in the summary, but as far as I can see you don't state/defend it anywhere else in the post. So people just reading the summary might overestimate the extent to which the post argues for this claim.
  • I've seen lots of relevant community building, and I more often see the opposite: people being such nerds that they can't help themselves from descending into friendly debate, people being sufficiently self-aware that they know their unintuitive/unconventional views won't convince people if they're not argued for, and people pouring many hours into running programs and events (e.g. dinners, intro fellowships, and intro-level social events) aimed at creating an open social environment.

(As an aside, people might find it interesting to briefly check out YouTube videos of actual modern cult tactics for comparison.)

When I say "repeating talking points", I am thinking of: 

  1. Using cached phrases and not explaining where they come from. 
  2. Conversations which go like
    • EA: We need to think about expanding our moral circle, because animals may be morally relevant. 
    • Non-EA: I don't think animals are morally relevant though.
    • EA: OK, but if animals are morally relevant, then quadrillions of lives are at stake.

(2) is kind of a caricature as written, but I have witnessed conversations like these in EA spaces. 

My evidence for this claim comes from my personal experience watching EAs talk to non-EAs, and listening to non-EAs talk about their perception of EA. The total number of data points in this pool is ~20. I would say that I don't have exceptionally many EA contacts compared to most EAs, but I do make a particular effort to seek out social spaces where non-EAs are looking to learn about EA. Thinking back on these experiences, and which conversations went well and which didn't, is what inspired me to write this short post.

Ultimately, my anecdotal data can't support statistical statements about the EA community at large. The purpose of this post is more to describe two mental models of EA alignment and to advocate for the "skill mastery" perspective. 

Mau

I think both (1) and (2) are sufficiently mild/non-nefarious versions of "repeating talking points" that they're very different from what people might imagine when they hear "techniques associated with cult indoctrination"--different enough that the latter phrase seems misleading.

(E.g., at least to my ears, the original phrase suggests that the communication techniques you've seen involve intentional manipulation and are rare; in contrast, (1) and (2) sound to me like very commonplace forms of ineffective (rather than intentionally manipulative) communication.)

(As I mentioned, I'm sympathetic to the broader purpose of the post, and my comment is just picking on that one phrase; I agree with and appreciate your points that communication along the lines of (1) and (2) happen, that they can be examples of poor communication / of not building from where others are coming from, and that the "skill mastery" perspective could help with this.)

Many domains that people tend to conceptualize as "skill mastery, not cult indoctrination" also have some cult-like properties like having a charismatic teacher, not being able to question authority (or at least, not being encouraged to think for oneself), and a social environment where it seems like other students unquestioningly accept the teachings. I've personally experienced some of this stuff in martial arts practice, math culture, and music lessons, though I wouldn't call any of those a cult.

Two points this comparison brings up for me:

  • EA seems unusually good compared to these "skill mastery" domains in repeatedly telling people "yes, you should think for yourself and come to your own conclusions", even at the introductory levels, and also just generally being open to discussions like "is EA a cult?".
  • I'm worried this post will be condensed into people's minds as something like "just conceptualize EA as a skill instead of this cult-like thing". But if even skill-like things have cult-like elements, maybe that condensed version won't help people make EA less cult-like. Or maybe it's actually okay for EA to have some cult-like elements!

Hi :) I'm surprised by this post. Doing full-time community building myself, I have a really hard time imagining that any group (or sensible individual) would use these 'cult indoctrination techniques' as strategies to get other people interested in EA.

Was wondering if you could share anything more about specific examples / communities where you have found this happening? I'd find that helpful for knowing how to relate to this content as a community builder myself! :-) 


(To be clear, I could imagine repeating talking points and closed social circles happening as side effects of other things - more specifically, of individuals who aren't very good at recognizing what makes a good argument and therefore repeat whatever seems salient to them, and of people naturally forming social circles with people they get along with. My point is that I find it hard to believe that any of this would be deliberate enough that this kind of criticism really applies! Which is why I'd find examples helpful - to know what we're specifically speaking about :) ) 

I should clarify—I think EAs engaging in this behavior are exhibiting cult indoctrination behavior unintentionally, not intentionally. 

One specific example would be in my comment here.

I also notice that when more experienced EAs talk to new EAs about x-risk from misaligned AI, they tend to present an overly narrow perspective. Sentences like "Some superintelligent AGI is going to grab all the power and then we can do nothing to stop it" are thrown around casually without stopping to examine the underlying assumptions. Then newer EAs repeat these cached phrases without having carefully formed an inside view, and the movement has worse overall epistemics. 

Here is a recent example of an EA group having a closed-off social circle, to the point where a person who actively embraces EA has difficulty fitting in. 

Haven't read the whole post yet, but the start of Zvi's post here lists 21 EA principles that are not commonly questioned. 

I am not going to name the specific communities where I've observed culty behavior because this account is pseudonymous.

"creating closed social circles"

Just on this, my impression is that more senior people in the EA community actively recommend not closing your social circle because, among other reasons, it's more robust to have a range of social supports from separate groups of people, and it's better epistemically not to hang out exclusively with people who already share your views.

Inasmuch as people's social circles do shrink, I don't think it's due to guidance from leaders (as it would be in a typical cult) but rather because people naturally find it more fun to socialise with people who share their beliefs and values, even if they think that's not in their long-term best interest.

I like the "skill-mastery" framing, and have both "think about it in the shower" and "gears-level mastery" as orientations in my thinking. I didn't have deliberate practice cached as much, nor the cluster of the three, but I think it's good and reminds me of the way the rationality community talks about the need for rationalist dojos and practice to actually become more rational.
