There are two ways I can see EA developing after the latest community drama: The community can have all the necessary debates in good faith, grow past them, and become a more mature movement. Or neither side feels heard, nothing changes, and far too many competent people lose trust in EA and focus their energies elsewhere. After all, I’ve heard people on either side declare EA’s moral bankruptcy as a consequence of the Bostrom email debate.
To shift the balance towards a little more movement maturation and a little less quitting, we need to talk. The conversations to be had here are between you and me, between him and them, not between a small band of anonymous rebels on one side and the EA establishment at CEA on the other. The more isolated and unheard people feel in their opinions, the angrier and more polarized things get. Popping the bubbles starts right in our own local EA groups.
For that, I’d like to share some tools that I found useful for resolving conflict in groups. It might be worthwhile to explore them on your own, and it might be even more worthwhile to spread them in your local EA group.
But how does this fit into our other community building work?
To answer this question, I’d like to summarize my favorite EAF post on community building of all time: Jan Kulveit’s “Different forms of capital”.
He claims that there are different forms of capital we can optimize for in community building, two of which EA tends to over-optimize for:
- Financial capital: The total amount of donations committed and sent to our charities of choice.
- Human capital: The number of active community members and their level of commitment.
Further, he argues, there are two other forms of capital EA tends to neglect, probably because they are harder to measure:
- Network capital: The number and closeness of ties between community members.
- Structural capital: The institutions and processes we have in place for doing stuff in a sensibly structured manner.
And I’d like to add a fifth dimension:
Knowledge capital: The sum total of the useful ideas and psycho- and social technologies widely available in the community. An instance of knowledge capital without which EA would be unthinkable is the capability to use language. Other examples of knowledge capital include knowledge about prioritization, forecasting, productivity tools, where and how to apply for grants, and how we think about mental health and staying productive in the long term. And, of course, our strategies for addressing and resolving conflict with friends, colleagues, and fellow EAs.
I think it would be wise for EA community building to start explicitly taking all five of these factors into account. In line with that, this post aims at increasing the community’s knowledge capital. Trying these tools one-off to resolve a particular issue is good, but what I’m hoping for is that over time, they simply become part of the way we do things. That way, they’d have the strongest and longest-lasting positive impact on our network capital, through frequent and casual prevention and repair of conflict.
Social technologies for resolving and preventing conflict
Rule 0 (coined by Seek Healing)
WHY: The longer we sit on irritations, the bigger they grow, until at some point we write very, very long and elaborate EA Forum posts. Addressing points of conflict sooner rather than later creates less friction and improves feedback loops in groups of people, no matter how uncomfortable it initially is.
HOW: Rule 0 is a rule of thumb that goes like this:
“If you feel queasy about addressing something with somebody, that’s a sign that you should try to address it.”
Doing EA Better gave some good reasons why it is particularly hard to do this if your opinions go against EA orthodoxy. But this just means that Rule 0 is even more important: If more of us take this as a rule for life, we gain more practice in addressing difficult things, people see us doing so, and we shift EA's conflict-averse customs.
Nonviolent Communication is a useful tool for applying Rule 0 properly, though it takes a while to master. Working through Boghossian and Lindsay’s “How To Have Impossible Conversations” might be the 80/20 version of learning NVC.
CFAR’s Double Crux
WHY: Often, we get all tangled up in the emotional side of a disagreement, or our positions seem so alien to each other that we never properly find the heart of the matter. Double Crux is a tool for navigating towards the core (“crux”) of the disagreement between two people, while not paying too much attention to arguments that aren’t actually crucial for either side. It can help turn an adversarial discussion into collaborative truth-seeking.
HOW: See the respective section in the CFAR handbook. Bonus points for organizing a Double Crux workshop for your local group that uses political conflict points within the EA community as practice material. (Though this can probably be done in a polarizing, net-negative fashion.)
Yes/No Debate
WHY: If Double Crux doesn’t work for you, Yes/No debates offer a more structured approach to reaching the same destination.
HOW: See the instructions at https://yesnodebate.org/, find someone to practice with, and off you go!
Street Epistemology
WHY: When things get heated, people tend to listen to their counterpart in order to respond rather than to understand. Usually, this results in neither side feeling heard and things just getting worse. Street Epistemology offers a useful framework and toolbox for learning to listen in order to understand, and for helping the person you listen to gain a deeper understanding of what they believe and why. Sometimes, this mode of questioning gets people to question their own beliefs while bypassing the backfire effect. In its purest form, SE was developed by atheists as a tool for (consensually) nudging people to reevaluate dogmatic beliefs. Used less purely, it can be remarkably useful for becoming a better listener and debate partner.
HOW: https://streetepistemology.com/ offers general resources and a link to the SE Discord server with regular practice calls.
Perspective-taking
WHY: Through many generations of evolution in the ancestral environment, humans have learned to easily differentiate between friend and foe, in-group and out-group. Instinctively, we treat people we see as part of our in-group with kindness, care, and compassion. But if our brain puts someone into the “out-group” category, it is all too easy to be inconsiderate or outright mean to them, to interpret their statements in bad faith, and to be very annoyed just by the sound of them breathing. Perspective-taking helps your brain move someone from the out-group into the in-group drawer while building more accurate models of their reasons for being the way they are.
HOW: Step 1: Pick a person you have trouble with. Imagine you are them. Speak from their perspective, in “I”-statements - either to a friend, or to an inanimate object. Start with superficial qualities like how the person you want to empathize with looks and what they do with their life, then narrow in on the context you have in common (in this case, probably EA), and move towards describing, from their perspective, your key disagreement and the reasons they have to disagree with you. The start could look somewhat like this: “I’m Sam, 32 years old, white, male, black hair, about 1.80m tall, average build. I work at the Sn-Risk Prevention Network as a data scientist. …”
Step 2: Step out of that role, and talk/write about what you’ve learned. If things went well, you should have slightly more understanding for the other person and be ready to move on to plan next steps.
This may sound odd, but it actually works. I learned it as an evidence-backed method in counseling training (though I haven't made time to check the relevant studies myself), got a ton of value out of it over the years, taught it in counseling trainings for teacher trainees who found it a mind-blowing game-changer, and used it with EAs in coaching sessions who, more often than not, found it enlightening too.
Authentic Relating Games
WHY: It’s easy to be frustrated with a community and its leadership when you feel like you don’t belong. AR games are structured practices that allow people to connect on a deeper, more personal level. While there are AR games specifically tailored to help with tense situations, the main benefit is preventive: building enough psychological safety in a community that disagreements become easy and are treated as a shared problem to solve, not a battle.
HOW: At https://www.authrev.org/manuals, you can find the “Authentic Relating Games Mini-Manual” as a free download. It contains a bunch of easy-to-facilitate games along with instructions on how to run them, including Hot Seat, which has become a staple activity at EA Germany’s retreats and house parties (and one of the secret sauces of our community).
Develop a healthy hot-takes culture
WHY: Nothing is more isolating in a community than feeling like there are things you can’t say, or parts of yourself you can’t show. You can change that through the way you engage in casual conversation.
- Add “What is a belief you hold that is controversial in your bubble?”, followed by curiosity and the will to understand, to your list of staple conversation starters. Note: The emphasis is on "in your bubble". If this is used to create shared knowledge and acceptance of conflict lines within EA, it's a useful tool for reducing polarization. If you use it in a way that elicits beliefs that are common within EA and controversial in the wider culture, however, you may feed an "us-vs.-them"-mindset in a way that is detrimental to EA's culture.
- Read, internalize, and preach Scott Alexander’s “Kolmogorov Complicity and the Parable of Lightning”. (It’s way more readable than it sounds.)
…so what shall I do with this?
You can learn the methods yourself, self-study with friends, or organize workshops in your community. If you want somebody to facilitate a workshop for you, I might be able to connect you to someone capable and willing.
Thanks to Christopher Leong for the first, second, and final nudge to write this post. Thanks to Luz Quinonero for comments on the draft.
I'm intentionally not linking much to that drama here, because I'm undecided on whether I think more or fewer people should get tangled up in it. Additionally, I hope that the information in this post is more timeless than whatever the latest community drama happens to be at its date of publication. I'm fairly confident that people getting tangled up in January 2023's drama a year later would be net negative.
See the debate in this comment thread if you're curious why I changed the term.
Conflict of interest: I'm friends with some people at Authentic Revolution.
Another possible name for memetic capital could be cultural capital.
That makes sense from first principles, but collides with convention in a way I'd rather not risk.
Since Bourdieu, "cultural capital" is a pretty loaded term in sociology. My off-the-cuff definition would be something along these lines: "The amount of competences and resources you have available for signalling successfully that you fit in with the upper strata of society."
Often enough, signalling cultural capital in the sociological sense and spreading memetic capital in my sense are outright incompatible goals. For example, for spreading valuable memetic capital, it might make sense to diligently follow Rule 0, and to say "Let's Yes/No-debate that!" any time a political issue arises. If you do that at a tea party hosted by the British royal family, you will probably not be invited again.
On another note, memetics is a field of knowledge that I think community builders should know way more about. It's just remarkably useful for developing intuitions around PR, infohazards, which programs to run and why, etc. Part of my intention behind choosing that handle is making memetics in general a bigger thing in EA. And just using the word "memetic" very often seems to me like a less-than-terrible way to sneak more knowledge about memetics into EA's memetic capital.
Good catch, also initially would've preferred "cultural capital".
But I'm not happy with "memetic" either. I associate memes with ideas that are selected for being easy and fun to spread, that affirm our biases, and that are spread mostly without the constraint of whether the ideas are convincing upon deeper reflection, or true or helpful for the brain that gets "infected" by the meme.
Some support for these associations from the Wikipedia introduction:
Yup, the evolution and spreading of memes is not always aligned with the good, true, and beautiful. But the same is true for genetics, the field which memetics is originally based on.
Human evolution shaped our gene pool in a way that makes us prone to some biases, some forms of prejudice, and violence. But that is just one of many aspects of these theories. That these are facts about genetics doesn't mean that genetics and evolution themselves are evil or tainted and something you shouldn't associate yourself with unless you are a hateful bigot. If we were to follow your rule to the end, we would have to reinvent all the vocabulary of genetic research from scratch, because racists sometimes like to talk about genetics as well. Memetics is not a trivial field of knowledge, and I think that just as with genetics, obfuscating the valuable work that has already been done there by reinventing the wheel with fresh branding is way too costly.
Since we're quoting Wikipedia anyway, mind the introductory definition in the article on memes:
That description is entirely independent of how valuable or damaging memes are, and it is pretty much exactly what I mean. "The memetic capital of the EA community", then, is the stock of good and useful memes we have readily available in the community, along with the absence of bad and harmful memes.
Thanks, I agree that we should generally not let a useful concept be tainted by negative associations. But!
That's kind of what I mean: memes are independent of how valuable or damaging the information is, and in my mind that's one main way the concept stands out among others such as "ideas". E.g., it seems odd to say "I'm not convinced by this meme".
Maybe "ideational capital" could fit better? This term seems to be used in political science, e.g.
Do you have a link to a smooth definition of "ideational capital"? I googled your citation and found a book, but apparently my skill at deciphering political science essays has massively declined since university.
A meta-level remark: I notice I'm a bit emotionally attached to "memetic capital", because I've thought about these things under the term "memetics" a bunch during the last year. In addition, a person whose understanding of cultural evolution I admire tends to speak about it in terms of memetics, so there are some matters of tribal belonging at play for me. Just flagging this, because it makes me prone to rationalization and to overestimating the strength of my reasons to defend the term "memetic capital".
Now to why I genuinely think "memetic capital" is more fitting:
1. It is useful not only for talking about propositional knowledge.
When I read "ideational capital", or generally "ideas", I initially think exclusively of propositional knowledge, i.e. things that can be stated as facts in natural language, like "Tel Aviv is a city in Israel." But there are other forms of knowledge than propositional knowledge. John Vervaeke, for example, describes the "4 Ps of knowledge":
- Propositional knowing (see above)
- Procedural knowing (knowledge how to do things, e.g. ride a bicycle, or fill a tax form)
- Perspectival knowing (knowledge of where and how you are situated in the world as an embodied being, e.g. where up and down are, that this is a computer and that is a glass door I can't just pass through without opening it.)
- Participatory knowing (knowledge of how to move in the world. E.g., whether you feel stuck and confused staring at a bouldering problem, or just get into flow while your hands and feet find the right places at the wall almost on their own.)
In my understanding, memetics is useful for describing all four forms of knowing, while at first glance, "ideational capital" only refers to the propositional kind. And in my opinion, the more interesting aspects of memetic capital are procedural, perspectival, and participatory knowing. It's better than nothing to have propositional knowledge that Double Crux exists and what the key steps listed in the CFAR handbook are. But the more interesting, and more important, thing is having an intuitive grasp of the spirit of the method, and intuitively, without thinking, applying it in a conversation like this one.
2. Memetics lends itself to a systemic, rather than engineering, approach to understanding and influencing social systems.
The observation that memes' evolutionary fitness is orthogonal to their usefulness points out a problem, but it also helps us get a better grasp of which strategies for spreading valuable memes might and might not work. For example, if we ask "Which core concepts should more people in EA know?", we end up writing a curriculum, like the post above. However, we can also ask "Which trajectory does EA's cultural evolution have, and how can we influence that trajectory so that it flows in a more desirable direction?" Then, we might discover completely different attack routes. For example, we end up with a call to action and an offer to connect group organizers with facilitators, like the one at the end of my post.
I wrote the following on Facebook in a precursor to this EAF post:
As it is based on evolutionary theory, I think memetics is particularly well-suited to describing and understanding processes of cultural evolution like the one outlined above, which I think is currently happening in EA. And the better we understand these processes, the better we can intervene on them to prevent bad outcomes. These could be an EA-internal culture war, or even the community eventually breaking apart because it can't handle its own diversity.
When I say "memetic capital", I don't have a specific set of timeless ideas in mind that all EAs should know/should have known about all along. Instead, I think of an ever-changing egregore of ideas, processes, traditions, social customs, social and psychotechnologies, figures of speech, and what have you. And the reference to evolutionary theory that the term "memetic" implies feels very elegant to me for pointing at this egregore.
The crux of EA's (or any social bubble's) memetic capital is that it is hard to inventory, hard to steer, and that in some sense, we are its servants rather than its masters. I think memetics can describe that. And in the process of describing, it can help us gain more agency over which memes we want to keep and which ones we want to get rid of.
Do let me know whether this makes sense or sounds like total gibberish to you. I'm thinking/explaining all these things for the first time. And the way I think about social systems is influenced way more by continental philosophy, psychoanalysis, systems theory, and Buddhism than by the more STEM/engineering-style approaches that are common in EA. Thus, I have no clue whether I'm at all understandable for people who read other books than me.
Thanks for the elaboration; no, that all felt very comprehensible to me! Hopefully I'm not gibbering too much either:
I agree that "idea" is more strongly connected with propositional knowledge, which is suboptimal, and that "knowledge" seems preferable, as it covers the other types you bring up. I just googled "knowledge capital": it might fit nicely, and it seems to be an existing term from economics that overlaps a lot with what you have in mind (though it is of course missing the side benefit of alluding to the potentially neglected forms of analysis you mention).
Memetics' lack of emphasis on argument still worries me
One potential point of disagreement that is also related to my discomfort with memetics is this sentence:
I generally feel icky about strategies that try to affect the cultural development of a community when this is not done transparently and deliberately. And memes kind of feel unilateral and uncooperative this way. For example, say you'd like the EA community to engage more with authentic relating training to prevent unproductive and unnecessary conflicts. One way to do this would be to make a case for it on the EA Forum, giving arguments and letting people see and get excited about your vision. Another approach would be to think about ways to package authentic relating into easily digestible chunks that include positive vibes about authentic relating and an association of authentic relating with effectiveness or whatever. :D
You bring up that memetics helps with understanding failure modes in cultural evolution, and that seems benign and useful to me. But I'd wish that the interventions based on that understanding are also done transparently and cooperatively, and while you probably agree with that, I still worry that a memetic approach emphasizes cooperative improvement and truth-seeking less than, e.g., "cultivating EA's shared knowledge (capital)".
Agreed, "knowledge capital" fits well.
And though I sometimes sound a whole lot Slytherin, I absolutely don't want to normalize using the Dark Arts in EA community building. I'll change the term in the initial post and link to this comment thread.
Severin -- thanks very much for listing and describing these useful techniques, and for including good links for everybody who wants to learn more about them.
I appreciate your general message here that there are time-tested social technologies for managing and mediating conflict, and EA might as well learn to use them when we most need them.
I loved this, thank you for writing it!