People doing meta-EA sometimes jokingly frame their work as “I want to indoctrinate people into EA, sort of like what cults do, but don’t worry! Haha! What we do is fine because EA isn’t a cult.”
I think this is a harmful mindset. When thinking about how to get more people to be EA-aligned, skill mastery is a better model than cult indoctrination.
Skill mastery often has these attributes:
- Repeated deliberate practice.
- “Thinking about it in the shower”, i.e. thinking about it without much effort.
- “Gears-level understanding”, i.e. knowing the foundations of the skill and understanding how all the pieces relate.
Cult indoctrination, by contrast, is more about gaslighting people into believing that there can be no truth other than X. Cults do this by repeating talking points and creating closed social circles.
Accordingly, when thinking about how to get more people to be EA-aligned, here are some good questions to ask:
- Can we build structures which enable repeated deliberate practice of EA? Good examples are intro fellowships, book recommendations, and club meetings. Are there more?
- Can we get people to “think about EA in the shower”? One way to encourage this could be to provide better-written reading materials that pose questions amenable to shower thoughts.
- Can we encourage more “gears-level understanding” of EA concepts? For example, emphasize the reasons behind x-risk calculations rather than their conclusions.
It is also probably a bad idea for EA to resemble a cult, because cults have bad epistemics. Here are some paths to avoid going down:
- Repeating talking points: when discussing EA topics with a skeptical non-EA, don’t repeat standard EA talking points if they’re not resonating. It is useless to say “AI Safety is a pressing problem because superintelligent AGI may pose existential risk” if they do not believe superintelligent AGI could ever possibly exist. Instead, you can have a more intellectually honest conversation by first understanding their current worldview and model of AI, and building from there. In other words, it is important to adopt good pedagogy: building from the student’s foundations, rather than instructing them to memorize isolated facts.
- Closed social circles: for example, in the setting of a university group, it is probably a bad idea to create an atmosphere where people new to EA feel out of place.
The central idea here is that promoting gears-level understanding of EA concepts is important. Gears-level understanding often has repeated deliberate practice and shower thoughts as prerequisites, so skill mastery and gears-level understanding are closely related goals.
I would rather live in a world of people who have their own sound models of x-risk and other pressing problems, even if those models substantially differ from the standard EA viewpoint, than a world of people who are fully on board with the standard EA viewpoints but don’t have a complete mastery of the ideas behind them.
Summary: People who try to get more people to be EA-aligned often use techniques associated with cult indoctrination, such as repeating talking points and creating closed social circles. Instead, I think it is more useful to think about EA-alignment as a skill that a person can master. Accordingly, good techniques to employ are repeated deliberate practice, "thinking about it in the shower", and promoting gears-level understanding.
Hi :) I'm surprised by this post. Doing full-time community building myself, I have a really hard time imagining that any group (or sensible individual) would use these 'cult indoctrination techniques' as strategies to get other people interested in EA.
Was wondering if you could share anything more about specific examples / communities where you have found this happening? I'd find that helpful for knowing how to relate to this content as a community builder myself! :-)
(To be clear, I could imagine repeating talking points and closed social circles happening as side effects of other things - more specifically, of individuals often not being that good at recognizing what makes an argument good and therefore repeating something that seems salient to them, and of people naturally creating social circles with people they get along with. My point is that I find it hard to believe that any of this would be deliberate enough that this kind of criticism really applies! Which is why I'd find examples helpful - to know what we're specifically speaking about :) )
I should clarify: I think EAs engaging in this behavior are exhibiting cult indoctrination behavior unintentionally, not intentionally.
One specific example would be in my comment here.
I also notice that when more experienced EAs talk to new EAs about x-risk from misaligned AI, they tend to present an overly narrow perspective. Sentences like "Some superintelligent AGI is going to grab all the power and then we can do nothing to stop it" are thrown around casually without stopping to examine the underlying assumptions. Then newer EAs repeat the...