
As many of us have seen, there has recently been a surge in discourse around people in the community with different views. Much of this underlying tension has been brought to the surface by the large scandals that have broken in the last six months or so.

I've seen a few people using language which, to me, seems schismatic. Discussing how there are two distinct and incompatible groups within EA, being shocked/hurt/feeling rejected by the movement, etc. I'd like to urge us to try and find reconciliation if possible.


Influential Movements Avoid Early Schisms


If you look through history at major religious/political/social movements, most of them avoid having early schisms, and those that don't suffer significant issues and tension. It seems optimal to let a movement develop loosely over time and become more diverse, before starting to draw hard lines between what "is" part of the in-group and what isn't.

For instance, early Christianity had some schisms, but nothing major until the Council of Nicaea in 325 A.D. This meant that Christianity could consolidate power and followers for centuries before actively breaking up into different groups.

Another parallel is the infamous Sunni-Shia split in Islam, which caused massive amounts of bloodshed and still echoes today, for instance in the civil war in Syria.

For a more modern example, look at the New Atheism movement, which in many ways attracted people similar to those in EA. Relatively early on, in fact right as the movement gained popular awareness (similar to the moment right now in EA), many prominent folks in New Atheism advocated for Atheism Plus. This was essentially an attempt to schism the movement along cultural/social-justice lines, which quickly eroded the movement's cohesion and ultimately contributed to its massive decline in relevance.

Effective Altruism as a movement is relatively brand new - we can't afford major schisms or we may not continue as a relevant cultural force in 10-20 years.


Getting Movement Building Right Matters

Something which I think is sometimes lost in community building discussions is that the stakes we're playing for are extremely high. My motivation to join EA was primarily because I saw major problems in the world, and people that were extremely dedicated to solving them. We are playing for the future, for the survival of the human race. We can't afford to let relatively petty squabbles divide us too much!

Especially with advances in AGI, I know many people in the movement are more worried than ever that we will experience significant shifts via technology over the coming decades. Some have pointed out the possibility of Value Lock-in, or that as we rapidly increase our power our values may become stagnant, especially if for instance an AGI is controlled by a group with strong, anti-pluralistic values.

Overall I hope to advocate for the idea of reconciliation within EA. We should work to disentangle our feelings from the future of the movement, and try to discuss how to have the most impact as we grow. My vote is that having a major schism is one of the worst things we could do for our impact - and is a common failure mode we should strive to avoid. 







One thing I'd really like to see is more experimentation inside the community.

For example, there was some discussion about democratic control of funding decisions. This isn't something we need to centrally decide, you can just... start doing it. Start a fund that makes decisions democratically! I bet you'd get funding from the EAIF even.

Got a great idea for how to set up codes of conduct to make diverse people feel welcome? Try them in your local group and report back on how they work!

I wish people felt more empowered to do this kind of thing. That's how we got our existing organisations and institutions: someone just went ahead and started doing it. They weren't created when Dustin Moskovitz arose out of the sea on a shell. Maybe e.g. the EAIF could explicitly call out experimentation as something it wants to fund.

Of course, this argument only applies to stuff you can do differently. If you don't like the way other people behave and just don't want to be associated with them, that's harder to reconcile.

As someone who is a part-time funded community builder, it's extremely difficult to do this kind of work. Community-building funding is extremely competitive, and running a local group takes a lot out of you. Plus the career/funding situation is precarious and not exactly a path to a lucrative career, so unfortunately I think a lot of people see community building as a temporary position or a means to gaining more clout in EA.

This is especially true of university organizers since by the nature of their position, they will only be organizing temporarily.

I'd like the community building movement, especially the paid organizers, to coordinate more. I think that if we could have an annual meetup country by country, perhaps people could coordinate and piece out larger projects - for example a group of 5 paid organizers could pool some funds or work on building an ethics framework together. 

EA generally has an issue where people say "if this is a problem, why don't you go do it?" I think if we focused more on coordinating and letting people solve problems as a group, we could get a lot more done. 

I don't mean to discount your experience, but from my experience something like running a small democratically allocated fund seems pretty doable by a single person as a fun side project with some spreadsheets (probably less work than writing a 20000 word critical essay...). If you want to get serious, sure, the work ramps up. But the point is to find out whether people are interested in a low cost way.

EA generally has an issue where people say "if this is a problem, why don't you go do it?"

I think we are probably sensitive to different things :) I feel like I see a lot more people saying "why doesn't Someone (usually 'the EA movement', which isn't, you know, an agent) just implement my pet idea?". Probably people say both :upside-down-smile:

I organize part time while also working full time as the head of sales at a tech startup. I really don't have time to coordinate this type of thing, but if I could contribute a few hours a month here or there I'd be happy to do it.

The problem is most people don't have the skillset to do sales, build spreadsheets, and identify funds. It takes either an incredibly talented person with a lot of time, or a group with different skills complementing each other. 

I feel like discussions about what we'd like social norms to be and (relatedly) how to react to "scandals" have an inherent dynamic that increases polarization. This often goes like this:

There's a scandal or the possibility of a scandal and there's a tradeoff to make with respect to several things of importance. (E.g., creating a welcoming and safe environment vs. fear that this devolves into a culture where 99% of people will end up cancelled eventually with no chance of redemption for increasingly less severe transgressions.) Many people have some opinion on where they would set this tradeoff, but different people would set the weights in different places. (And some people may just say things that they expect will be received well, adding momentum to whichever direction the pendulum is currently swinging.) Moreover, people operate in different parts of the EA community and have widely different day-to-day experiences filtered by their personality, standing in the movement, preferred ways of socializing, things like gender or ethnicity, and so on. So, even if two people in some sense agreed that the ideal norms for the movement would set the tradeoff in one specific way, they may disagree, based on the different glimpses of the movement they catch, about where the pendulum is currently at.

Now, since people often care really strongly about what the norms should be, it can be quite distressing if someone wants the pendulum to be at a 60 degree angle and thinks it's currently at a 120 degree angle, and then a person who wants it to be at the 120 angle comes and talks as though it's already at the 60 degree angle. While these two people only differ by 60 degrees (one wants it at 60, the other at 120), it seems to them as though they differ by 120 degrees (they both think the pendulum is currently far away from them). This impression of vast differences will cause them to argue their case even more vehemently, which further amplifies the perceived difference until it feels like 180 degrees – total opposition.

I'm not sure what to do about this. One option could be to debate less where the pendulum is exactly in a movement-wide sense (since that's impossible to begin with, given that the answer will be different for different parts of EA – both geographically and in terms of more subtle differences to what people see and experience – and also because no one should be confident about it given the limited glimpses that they catch). Instead, we could say things like "I think the pendulum is too far to the left in such and such situations (e.g., Bay area community house x)." Or, alternatively, we could focus more on what the movement should ideally look like. (E.g., maybe write down the reaction you'd like to see instead of focusing on why you don't like other people's reactions.) People will still disagree on these things, but maybe the disagreements will feel more like the 60 degrees rather than the doubled 120 degrees?

To make clear that one is making statements about where one wants the pendulum to be instead of where it's currently at, I think it's also useful to simply acknowledge all the values at stake in the tradeoff. This makes clear to others that you at least see where they're coming from. It also makes clear that you're not engaged in frantic pendulum-pushing where you think the pendulum has to move in a specific direction at all costs without worrying about how far it already went in some places.

Lastly, maybe it would be good if people thought about what sort of viewpoints they disagree with but still find "defensible." I think it makes total sense to regard some viewpoints as indefensible and try to combat them and so on, but if we resort to that sort of reaction too quickly, then it becomes really difficult to coordinate on anything. Therefore, I often find it refreshing when people disagree in a way that makes clear that other perspectives are still heard and respected. 

Great comment. I agree that the best way forward is acknowledging that different parts of EA have different values, and making sure community organizers/high-status people in the movement are aware of and sensitive to that sort of thing.

For instance in my local group, I think if an EA from SF came and started propositioning people for dates/discussing polyamory, that would seriously break our norms. Perhaps having more intentional communication between group organizers could help?

For instance, maybe a mandatory meeting once or twice a year for all paid organizers in a nation, where people can get an idea of what is kosher where. What are your thoughts here?

If a sub-faction of EA thinks that the movement as a whole is unwelcoming to them or no longer represents their beliefs/values, it could make sense for them to schism into a movement that does represent them rather than to just leave slowly over time and form nothing. 

Avoiding a schism requires active work by the community to ensure that everyone feels welcome, valued, and respected. 


When I think about being part of the movement or not, I'm not asking whether I feel welcomed, valued, or respected. I want to feel confident that it's a group of people who have the values, culture, models, beliefs, epistemics, etc that means being part of the group will help me accomplish more of my values than if I didn't join the group.

Or in other words, I'd rather push uphill to join an unwelcoming group (perhaps very insular) that I have confidence in their ability to do good, than join a group that is all open arms and validation, but I don't think will get anything done (or get negative things done).

And to be more bold, I think if a group is trying to be very welcoming, they will end up with a lot of members that I am doubtful share my particular nuanced approach to doing good, and with whom I'm skeptical I can build trust and collaborate because our worldviews and assumptions are just too different.

If you indicate to X group, directly or otherwise, that they're not welcome in your community, then most people who identify with X are probably gonna take you at your word and stop showing up. Some people might be like you and be willing to push past the unwelcomeness for the greater good, but these people are rare, and are not numerous enough to prevent a schism. 

Ultimately, you can't make a place welcoming for every single identity without sacrificing things. If X is "neo-Nazis", then trying to make the place welcoming for them is a mistake that would drive out everyone else. But if X is, like, "Belgians", then all you have to do is not be racist towards Belgians.

Agreed! I suppose what I’m saying is that we make sure to do that active work now before things get too far out of hand.

I’m curious how other movements have managed this issue - do you know of any examples of success here?

Adding a +1 - historical analysis of institutions and movements that have grown and succeeded in their goals, versus those that have stagnated, suffered schisms, and split, would be highly valuable. I can't remember the website, but maybe there should be an EA Forum post bounty on this?


Effective Altruism as a movement is relatively brand new - we can't afford major schisms or we may not continue as a relevant cultural force in 10-20 years.

If the next year looks anything like the last six months, I do not feel bullish about EA being a relevant (and positive) cultural force in 2033. I hate to say that, but I think it's important for us to be clear-eyed about the current status of the movement—and its likely trajectory—in figuring out how to move forward.

Sometimes when playing chess, it becomes clear you can't win, and your goal shifts from winning to avoiding a loss. So when you say "we can't afford major schisms," my reaction is that we may not be in a place where we can afford the best option (i.e., not schisming), or to put this in more EA terms, sometimes the most cost-effective interventions are unaffordable.

Since reading this comment, I've been thinking about what it would look like for EA to be dying. After all, billions of dollars are committed to EA, many people consider themselves to be EAs, and there are lots of organizations aligned with the movement. Given that, how could EA die? To me, EA dying might look like: (1) limited new funding being committed to EA (or funding that was committed disappearing), (2) the number of people who identify as EAs decreasing, and (3) EA organizations failing. I think we have some evidence for (3); from conversations with people, I am guessing we'll start to see (2); and I'm worried that EA is going to have an increasingly hard time finding new funding (plus the demise of FTX alone constitutes (1)). So I'm really worried that EA (as we know it) is dying.

To perhaps bend this metaphor to its breaking point, unless you think EA definitely isn't dying, it's still sensible to have an advanced directive and organ donation plan in place. In other words, I think it's worth considering what a productive schism would look like—and what the likely trajectory of EA sans-schism is—before we rule out the "divide and conquer" approach, even if schisming is an outcome we could ideally avoid.

One thing I remain steadfastly optimistic about is the creativity, brilliance, and motivation of EAs. People in this community really want to improve the world, and I believe we can, but we should take a really hard look at whether our current approach—one of being unified under the EA umbrella—will be the best one going forward.

Just to add a note of optimism: a) people always take recent news too seriously; and b) many people don't read the forum. It's easy to think that everything is gloom if you spend too much time reading the drama on the forum, but most of reality hasn't changed. We still have thousands of people deeply engaged in doing good and their projects are still going as well as they were before. There are problems, sure, but announcing death is extremely premature IMO.

I appreciate your point, but this isn't consistent with my experience. I find that the Forum seems to be more bullish on EA than both EAs and non-EAs I talk to elsewhere/privately.

[Edit: If you feel like it, I’d also appreciate a response to my substantive points. Is it that:

(1) Your framework for what it’d look like for EA to be dying is different from mine?

(2) You accept my framework, but don’t think EA currently meets the criteria I’ve delineated?

And, separately, do you disagree with my point that even if EA dying is unlikely, we should still make a contingency plan?]

I think finding out the true state of play here is really important. What signs would we look for as indicators of the EA movement's health, following the three criteria you suggested above? Perhaps the rate of sign-ups to the GWWC pledge, or total EAG applications, or people signing up to EA virtual courses? Funding might be easier to track, but the numbers are always going to be skewed by Open Philanthropy, and I don't think that Dustin and Cari are going to go anywhere soon (which might update you slightly toward EA robustness?).

I guess there might be more failure modes than EA 'collapse', though we ought to watch out for it. This could be a bit of a retrenchment for the movement, where hopefully we can learn from our mistakes, improve institutions, and keep doing good in 2023 and beyond.

I'd suggest unrelenting, near-uniform public hatred as a potential failure mode (which != having many enemies or being merely unpopular). Some degree of other actors being willing to cooperate can be awfully important to effectiveness.

I don't agree with any of your criteria for "death". All of those sound totally survivable. "EA exits its recent high-growth phase" is very different from dying.

I would modify them to:

  1. Significant year-on-year decreases in funding
  2. Significant year-on-year decreases in self-identifying EAs

i.e. we transition to a negative growth regime and stay there.

And I think we could survive a lot of organizational collapse so I wouldn't even include that.


I see your point and upvoted, but schism is a pretty loaded word. Not all breakups are schisms, and it's plausible that an amicable parting would allow people to work in a movement where they will be more effective. Sometimes the kids are happier in a fairly good divorce than in a turbulent marriage.

There are costs to a unified movement. Given the most likely fault lines for a split, what would people need to compromise on, and do you think they are willing?

To me, some kind of federalism seems worth thinking about as a way to take stress off some of the fault lines. You can choose to live in Alabama or California while still being American. You can be Baptist or Catholic and still be Christian.

What is making things non-federal today? There already are, e.g. groups for Christians in EA which have some quite different ideas to the rest of the movement but coexist pretty peacefully. Is there something more that you would want there?

In (US-style) federalism, the subunits (US states) have quite a bit of power and autonomy. I don't have to worry myself about what Alabama decides about abortion or education. Being a South Carolinian or an Oregonian is a significant part of one's political identity in a sense.

So if, for instance, the "edgy"/"normie" divide became a critical fault line, under federalism you might see substantial meta groups focusing on one side of the line or the other, with, for example, their own high-end conferences and internal networking. It's not a rupture because central EA orgs would still exist, but they would do somewhat less.

It's still a germ of an idea; the basic thought is that there has to be some way to let some steam out of the kettle -- creating some degree of separation -- before the kettle blows up and ruptures.

Okay, but different groups and orgs can already have different norms today, right? Nobody is enforcing conformity. The worst that can happen is that CEA can ban you from EAG, so I guess yes it would be nice to have someone else running conferences so you could go to those?

I'm not playing dumb here, I genuinely find it confusing in what ways people feel they are being coerced by a central power in EA.

Ok, I mostly agree with you, but let's reframe as a devil's advocate: what if "EA" is a shaky concept in the first place (doesn't carve reality at joints)? Would you then agree that borders should be redrawn to have a more coherent mission, even if that ends up cutting out some bits of the "old EA"?

I think we can manage to have different enclaves of EA with different norms, that still broadly agree and play nicely with each other. As a community organizer I hope to get a better idea of what different groups value so I can navigate these situations better.

Could you explain a bit more as to what you’re proposing?


Something which I think is sometimes lost in community building discussions is that the stakes we're playing for are extremely high. My motivation to join EA was primarily because I saw major problems in the world, and people that were extremely dedicated to solving them. We are playing for the future, for the survival of the human race. 


Yeah, thanks for this post. I got a bit invested in some of the recent discussions here. Then I looked on my "EA to do list" (which is long, has very little to do with the recent situation, and certainly nothing to do with sh*tstorming on the forum) and I realized I lost focus. 

So I don't know if we are going to split or not, I'm leaving it to community builders, it's not my focus area. Yes or no, my "to do list" stays the same. 

Glad to hear that. I noticed myself losing focus too - a big part of why I wrote this post.

I wish you luck on your to do list, and as a community builder well… we’ll do our best. :)

Wil - this is perfectly reasonable, and generally true. 'Avoid schisms' is prudent for movement-building.

However, let me try to steel-man a possible counter-argument. In modern culture, we see many examples of woke activists taking over movements and organizations from the inside, by making ideologically motivated demands, playing various victim cards, demanding more 'diversity and inclusion', and trying to nudge the movement/organization in partisan political directions. Ever since the 'long march through institutions' (1967) and Rules for Radicals (1971), this has been refined into an extremely common and effective tactic, and it has arguably had very negative effects in academia, media, corporations, and governments.

Movements and organizations often find it difficult to protect themselves from woke takeover, because they don't understand what's happening, they don't have good counter-arguments, they're too guilt-prone and easily shamed, and they're too conflict-averse.  All too often, the movement/organization feels like they face a dilemma: either (1) give in to the woke activists, submit to all their demands, and take the movement in partisan politicized directions, or (2) accept a schism in which the woke get to take over the larger portion of the original movement, and a minority of anti-woke, ornery, heterodox contrarians go off and start their own movement (which is what we're currently seeing in American academia, media, corporations, and state governments).

I hope EAs see what's happening here, and understand the clear and present dangers of a woke takeover of the EA movement. We need to find a third option between accepting a woke takeover, and falling into a woke-versus-antiwoke schism. IMHO, that third option needs to be grounded in a radically honest discussion of what's really been happening in the EA movement over the last few months. I don't know what the optimal solution would be. But it might involve making it clear to the minority of woke activists in EA that their ideological values are simply not consistent with EA ethical and epistemic values -- just as many other kinds of political, religious, and ideological values are not consistent with EA values.

I agree with much of your underlying frustration, Geoffrey, but I worry that explicitly anti-woke sentiment could encourage the perception that the woke are "not welcome here".

So many people find and contribute to EA from woke/woke-adjacent circles like climate change activism, welfare capitalism, and animal welfare. Even if you and I disagree with their ideological views, they're still trying to improve the world, the same way as you or I. 

I hope that, much the same way EA is influenced by wokism, woke EAs are influenced by EA to refine the ethics and epistemics of their ideology. I'd like to throw in a (perhaps naive) vote for not explicitly alienating woke EAs if at all possible.

Ariel -- thanks for your calm & constructive comment. I take your point that many people in those movements such as animal welfare and climate change activism tend to be woke-adjacent. I also accept that everybody thinks they're trying to improve the world, given their own values and beliefs.

It's worth having a discussion about whether EA should be explicitly anti-woke, or woke-neutral, or pro-woke (which often includes pretending not to know what 'woke' means). However, there's a variant of O'Sullivan's Law that seems to operate in modern culture, such that any organization that is not explicitly anti-woke tends to become woke.

Hi Geoffrey, I commented earlier asking what you mean by woke, and would like to clarify that I'm not "pretending not to know what 'woke' means." It's a word that I only ever see employed derisively and I am truly not sure what it's supposed to mean beyond "views that the speaker holds in contempt." So if you are able to give a sense of what "woke" means to you I would appreciate it, as that would help me understand your viewpoint.

And what specific, significant "woke" pressures are being put on EA?

(I would categorize some of the ConcernedEA platform as "woke," but I don't get the sense that those parts of the platform are getting much support.)

synonyms might be "SJW" or "DEI".

Thanks, but that doesn't actually help me engage with the objections of those who are afraid of wokism (or SJW, or DEI) weakening the movement, because each of those terms can mean so many different things. 

A sampling of ideas that seem like they could be included under the umbrella of "wokism" in an EA context:

  • "Catering at EA events should be vegan"
  • "EA spaces should be welcoming for trans people"
  • "EA would be stronger if EAs were less homogenous" 
  • "Reports of sexual assault and harassment should be taken seriously"
  • "Racism, including so-called 'scientific' racism, is a scourge"

As is probably evident from my comment history, I do happen to agree with all of these assertions. But I would be interested in engaging respectfully with someone who didn't. What I can't do is meaningfully respond to the idea that wokism, undefined, is threatening EA.

(edited to add - if anyone who disagree-voted would be willing to tell me what they disagree with, I would appreciate it)

I think part of the difficulty here is that "wokism" seems to refer to a genuine cluster of ideas and practices, but one without especially clear boundaries or a single easy definition.

What I do notice is that none of the ideas you listed, at least at the level of abstraction at which you listed them, are things that anyone, woke or anti-woke or anywhere in between, will disagree with. But I'll try to give some analysis of what I would understand to be woke in the general vicinity of these ideas. Note that I am not asserting any normative position myself, just trying to describe what I understand these words to mean.

I don't think veganism really has much to do with wokism. Whatever you think about EA event catering, it just seems like an orthogonal issue.

I suspect everyone would prefer that EA spaces be welcoming of trans people, but there may be disagreement on what exactly that requires on a very concrete level, or how to trade it off against other values. Should we start meetings by having everyone go around and give their pronouns? Wokism might say yes, other people (including some trans people) might say no. Should we kick people out of EA spaces for using the "wrong" pronouns? Wokism might say yes, other might say no as that is a bad tradeoff against free speech and epistemic health.

I suspect everyone thinks reports of assault and harassment should be taken seriously. Does that mean that we believe all women? Wokism might say yes, others might say no. Does that mean that people accused should be confronted with the particular accusations against them, and allowed to present evidence in response? Wokism might say no, others might say yes, good epistemics requires that.

I'm honestly not sure what specifically you mean by "so-called 'scientific' racism" or "scourge", and I'm not sure if that's a road worth going down.

Again, I'm not asserting any position myself here, just trying to help clarify what I think people mean by "wokism", in the hopes that the rest of you can have a productive conversation.

"Catering at EA events should be vegan"

none of the ideas you listed, at least at the level of abstraction at which you listed them, are things that anyone, woke or anti-woke or anywhere in between, will disagree with

This is a tangent, but raising my hand as someone who does disagree that EA events should generally have only vegan food. I think having good vegan food available is very important, and think you can make a good case for excluding meat, but the more you constrain the menu the harder it is for people to find the food they need. This is especially a problem for longer or residential events, where the downsides of a limited diet compound and going out to get different food can be logistically challenging.

I agree on food. I was careless with my qualifications, sorry about that.

I'm not "pretending to not know what woke means"; I genuinely think it would be constructive for you to define what it is you mean when using it, and to explain why you think it is a threat to EA.

Some things I think you could mean:
  • People who talk a lot about "positionality"
  • People who look at white men distrustfully and assume they have bad intentions
  • People who talk about diversity and inclusion and virtue signal about said things
  • People who are part of the "culture wars" in the United States

The problem is that, I genuinely do not know how you define it, and how you think this applies to EA or is some sort of threat to EA. 

Another problem is that you seem to assume you can identify whether or not someone is "woke" without actually defining what that means or really knowing the person. I don't think that's fair. I also think you are doing what I notice people on Twitter do, which is look at really superficial things like how someone talks or presents themselves online, and think you can categorize them as "woke" or "not-woke". It's just very polarizing. I actually think you and I agree on a lot more than you would assume, but because I disagree with you using "woke" language you assume I am "pro-woke".

You're welcome! I agree that the discussion is worth having, and won't pretend to know the right answer. Your point that the de-facto choice may be between anti-woke and (eventually) pro-woke is legitimate.

One consideration which we might be underestimating (I'm not just saying this; I mean it :P) is the impact of ways EA could influence woke ideology:

  • Expose woke people to prioritarianism, which incorporates their perception of the effects of oppression between groups of people, while often resulting in de-facto EA conclusions.
  • Expose woke people to the possible moral significance of disenfranchised groups which are typically ignored in the public eye, such as future people and wild animals.
  • Encourage woke people to quantify their perceptions of how oppressed different groups are, and how to make tradeoffs between interventions which help different groups. This also often leads to EA conclusions. For example, under most plausible assumptions of the sentience of farmed animals, it seems likely that a given intervention in farmed animal welfare will reduce more suffering than a given anti-racist intervention.

Strong agree! The basic framework of EA, using utilitarian EV calculus to have more impact, can be adopted by folks on the left or right. People who are more into social justice and climate change can learn to have better feedback mechanisms to increase impact.

At the same time, conservative religious groups that do a ton of charity could also be led to using more effective interventions. I don’t think the EA framework needs to be politicized.

This comment implies the only relevant division is over wokery. I'm not sure why you focused only on that, but there are other ways people can practically disagree about what to do...

Wokeism has been an on/off discussion topic in EA for about as long as I can remember. My woke friends complain that EA is hopelessly anti-woke, and my anti-woke friends complain that EA is hopelessly woke. The predictions of political schism or ideological takeover keep being made, and keep being false.

In my opinion, we've already found a "third option" which works: the empathy to seek mutual understanding, the philosophical sophistication to critique fashionable ideas, and the willingness to share our perspective even when it seems unpopular.

I like this :)


I found parts of this 3-month old comment by a non-Western trans man writing about the masculinity-femininity divide to be really insightful and prescient.

Just as many people point to 'toxic masculinity' (which can also be present in women), I think they should also acknowledge the existence of 'toxic femininity' (which can also be present in men). FWIW, I think a lot of activists raised in (somewhat-functioning) democracies are underestimating the dangers of limiting free expression, the dangers of marginalizing people whose features were historically associated with having more power, and the possibility that they might be becoming more sensitive to things that they can otherwise overcome.

Hey Geoffrey, thanks for pointing this out. I agree it seems like you immediately got downvoted hard - I’ve strong agreed to try and correct that a bit.

I broadly agree with you on this, and I’m glad we’re having this conversation. However, I think framing it this way is problematic and leads to tribalism. The way I see it, the ‘woke takeover’ is really just movements growing up and learning to regulate some of their sharper edges in exchange for more social acceptance and political power.

Different movements handle this better or worse - New Atheism is an example that was ruined by it. I’m optimistic that EA can learn to become more welcoming and palatable to normal folks on the left and right, while keeping the old guard, if we play our cards right.

The largest divide seems to be between the older folks who prize unconventional dating and social norms like polyamory, radical honesty, etc., and a lot of the more “normal” folks who may be turned off by that sort of thing. For instance, in the local group I lead in Raleigh, NC, we have a large number of people with relatively standard intuitions about sex and relationships.

My biggest goal is learning how to increase their involvement and engagement in EA without turning them off - something we’ve already had to deal with a bit in the wake of SBF.

Building that middle-ground framework will be tough. Do you have any ideas as to where we can start?

The way I see it the ‘woke takeover’ is really just movements growing up and learning to regulate some of their sharper edges in exchange for more social acceptance and political power.

I don't agree with this part of the comment, but am aware that you may not have the particular context that may be informing Geoffrey's view (I say may because I don't want to claim to speak for Geoffrey). 

These two podcasts, one by Ezra Klein with Michelle Goldberg and one by the NY Times, point to the impact of what is roughly referred to in these podcasts as "identity politics" or "purity politics" (which other people may refer to as "woke politics"). According to those interviewed, the effect on these movements and nonprofits has been to significantly diminish their impact on the outside world.

I also think that it would be naïve to claim that these movements were "growing up" considering how long feminism and the civil rights movement have been around. The views expressed in these podcasts also strongly disagree with your claim that they are gaining more political power.

I think these experiences, from those within nonprofits and movements on the left no less, lend support to what Geoffrey is arguing. Especially considering that the EA movement is ultimately about having the most (positive) impact on the outside world.

“ The way I see it the ‘woke takeover’ is really just movements growing up and learning to regulate some of their sharper edges in exchange for more social acceptance and political power.”

I think there is some truth in movements often “growing up” over time and I agree that in some circumstances people can confuse this with “woke takeover”, but I think it’s important to have a notion of some takeover/entryism as well.

In terms of the difference: to what extent did people in the movement naturally change their views, vs. to what extent was the change compelled?

I suppose protest can have its place in fixing a system, but at a certain hard-to-identify point, it essentially becomes blackmail.

Wil - thanks for the constructive reply. That's all reasonable. I've got to teach soon, but will try to respond properly later.

I'm grateful for this comment, because it's an exemplar of the kind of comment that makes me feel most disappointed by the EA community.

It's bad enough that influential EAs have caused a lot of damage to other individuals, and to the good work that might be done by the community. But it's really upsetting that a lot of the community (at least as exemplified by the comments on the forum; I know this isn't fully representative) doesn't seem to take it seriously enough. We're talking about really horrible examples of racism and sexual harassment here, not 'woke activism' gone too far. It hurts people directly, it repels others from the community, and it also makes it harder to further important causes.

It's also couched in the terms of 'rationalism' and academic integrity ("let me try to steel-man a possible counter-argument..."), rather than just coming out and saying what it is. I don't think you're (merely) trying to make a hypothetical argument. Similarly the "I hope EAs see what's [really]* happening here, and understand the clear and present dangers..." sounds alarmist to me.

*I included the [really], because it seems to me like the author of the comment is trying to lend weight to their argument by implying they are revealing something most people would otherwise miss.

I understand the frustrations you and others are voicing, but I think it's more a lack of competence and understanding of management, power differentials, and social skills among some of the higher-level EAs. I highly doubt that the upper echelons of EA are full of malicious sociopaths who are intentionally harming people.

EA has done a lot of good, and people make mistakes often. I do think we need to rectify those mistakes and punish bad behavior, but we should try to make sure we don't alienate the old guard of EA for making mistakes in the socializing/dating world. A lot of people struggle to understand what is okay and what isn't - I'd rather try to reconcile with or educate them than have us attack each other. That's the point of this post.

Does that framing make sense to you?

Hi Wil,

My comment here was about Geoffrey Miller's comment, rather than your original post as a whole (albeit I separately took issue with your use of "relatively petty..."), so I'm not sure I follow where you're going here.

FWIW, if you're referring to recently-come-to-light examples of sexual harassment and racism when you say "it's more a lack of competence...", then I would disagree with your characterisation. I think by saying that the likes of Owen Cotton-Barratt and Nick Bostrom aren't "malicious sociopaths", and that they didn't do it 'intentionally' you fail to acknowledge the harm they've done. It's a similar line of argument to your original post when you compare the harm done with "the survival of the human race". I think it's missing the point, it's insensitive, and implies that they're not soooo bad.

I also worry when the initial reaction to someone's misdeeds is "let's make sure we don't punish them too harshly, or we'll alienate them", rather than "this is really wrong, and our first priority should be to make sure it doesn't happen again". My initial response isn't to shed a tear for the damage to the career of the person who did the wrong thing.

I disagree with your framing this as "attacking" the people that have done wrong. If anything, it's the people on the end of the sexual harassment that have been attacked.

I find it distasteful when people point to things like "EA has done a lot of good" or "EA has saved a lot of lives" in the context of revelations of sexual harassment etc. While it might be factually correct, I think it gives the sense that people think it's OK to do horrible personal things as long as you donate enough to Givewell (I very much disagree).

And one final point: I don't think "the old guard of EA" is the right frame (although I'm somewhat biased as I was involved in EA in 2011-12).  I don't believe the majority of wrongdoers are from this group, nor do I believe the majority of this group are wrongdoers.

So no, that framing does not make sense to me.

Thanks for responding. For what it’s worth, I personally think OCB should be permanently removed from positions of power in EA, and possibly socially distanced. Strong incentives against that type of behavior, especially right now, are extremely important. I’m disappointed with the response from EVF and think it should be far harsher.

The distinction I’m trying to make is that we shouldn’t assume all powerful people in EA are bad apples as a result of this scandal breaking.

Thanks Wil. I can agree with that.

I agree with this - it is also why I disagree-voted, and no, I don't have notifications set up for Geoffrey (as mentioned in another comment by them). 

The comment felt to me like it was undermining a lot of the recent criticism regarding people in powerful positions showing, AT THE VERY LEAST, very bad judgement. The comment makes me very sad and angry.


having a major schism is one of the worst things we could do for our impact

I definitely agree. For EA to follow the lifecycle of 'New Atheism' would be one of the worst outcomes for the movement. Thinking about some reasons why there could be a schism:

  1. Irreconcilable differences in belief between members of the EA Community and a zero-sum mentality about the resources that ought to go to different subgroups/causes in EA
  2. A breakdown of trust / perceived legitimacy between EA's leaders / institutions / power holders and a significant section of the EA Community - especially if the latter holds little formal power
  3. Lack of belief by the community that EA can achieve its goals, or that its goals are wrong
  4. The emergence of a new powerful social movement that appeals to similar demographics as EA does

Of these 4, I think that 2 is clearly the biggest threat. I think 1 can be overcome with a commitment to pluralism, though obviously some beliefs will fall outside of EA.

In order to avoid 2, I think EA needs to take these questions of community power seriously. The positive impact of EA is a function of its ideas and people yes, but it is also a function of its organisations and institutions - be they formal or informal. Personally, I see a lot of real value in Carla Cremer's suggestion that EA have an institutional turn - and not only think about improving the institutional decision-making in EA cause areas, but also within EA itself!

(As a historical aside, I think the Sunni-Shia split is actually an interesting counterexample to what you raise here. Despite the initial series of battles and conflicts around succession from the Prophet, this didn't stop the Umayyad Caliphate from becoming one of the world's largest empires afterwards, and Islam becoming the world's second largest religion!) 

We are playing for the future, for the survival of the human race. We can't afford to let relatively petty squabbles divide us too much!

I think this is the sort of reasoning that has a) possibly contributed to some of the recent damaging behaviour by EAs and b) almost certainly contributed to the failure to take that behaviour seriously enough.

Everything is "relatively petty" when compared to the survival of the human race, but I don't think that's the relevant comparison here.

It's also the sort of reasoning that has let us get billions of dollars into funding for effective solutions, saved tens of thousands of lives, and created a broad social movement with tremendous impact and impact potential down the road.

The totalizing reasoning of EA does have negative aspects, but I don't think we should throw out the baby with the bathwater! It is possible to salvage the good parts of the EA framework while maturing the movement. To do that we have to make sure to reduce tribal tensions and work towards understanding. 

Optimize for light, rather than heat. 

Thanks for a post that I disagree with! I want to see more heretical factions form splinter groups, "EA is wrong about xyz and here's how we're going to fix it", etc. Barring a long discussion of the reference classes of social change and whether EA is closer to "feminism"/"environmentalism" or "the rockefeller foundation", I think it's deeply plausible that cohesion is actually a threat to our goals. 

  • Institutions like to preserve themselves at the expense of their goals (I halfway tongue-in-cheek invoke "the Soviet Union" to reason about this, sometimes) 
  • Groupthink. In AI alignment, researchers who don't deep down understand the threat models they're ostensibly trying to solve produce bad research outputs, but social pressures might push them to work in areas they don't really believe in. 
  • Peter Wildeford crushed it 7 years ago with a post about "the meta trap", I think cohesion exacerbates the meta trap. 
  • A wider attack surface for more vulture-related problems. 
  • To me, a corollary of Holden's emphasis on asskicking is that we want to cultivate uncorrelated skillsets. We want to make bets on areas of expertise that could come in handy in unpredictable ways. (CoI: I'm making one such bet myself as we speak!). I think cohesion could lead to underemphasizing this. 

Related: I think some EAs target massive conversion rates, which I think is wrong. 

Interesting discussion, but I suggest in part going back to basics. I feel it would be helpful to mentally divide the nature of what is being discussed (and at times hastily tossed into this forum) into three general topics:

A. Intellectual diversity and an interesting debate space, which helps us all look deeper into the real issues EA was initiated to try to address.

B. Governance failures and personnel misconduct: financial and legal red cards and suspicions, personnel scandals, and examples of bad and very bad behaviour within or on the fringes of a work environment, paid or unpaid.

C. Your very personal lives, and your emotional state today, particularly the minute before you hit the Submit button.

Subject A is tricky to simultaneously encourage and keep manageable. Approaches (to vigorous debate, intellectual diversity, etc.) with a good track record include group facilitation, membership guidelines, and ethics committees.

Subject B is addressed routinely in the rest of the world through fairly replicable governance measures: rules, sanctions, and behavioural norms, equally applicable to a think tank or a construction site. This approach is needed even more, and legally required, when it comes to managing money. So, for example, having a clear and real separation of roles to avoid a financial conflict of interest in spending donor funds is not a schism - it is an obligation.

Subject C, in my opinion, does not really belong on a publicly accessible forum, now probably being regularly mined for journalistic content and ammunition for spoilers. Maybe it is needed, but just take it offline into a private forum with the relevant people.

The author is right to point out the trend and risk of schism. We should all be allowed to contribute in territory A - the bigger and more diverse the group, the better. Debates on fundamental direction, strategy, etc. can improve the outcomes. It would be a pity if a break-up happens simply because of insufficient understanding of the rationale for separating the three topics noted above. In summary: A is what we are all here for, an investment in B enables this to continue, and C is possibly not really forum business.

I think this post falls short of arguing compellingly for the conclusion.

  • It brings 1 positive example of a successful movement that didn't schism early on, and 2 examples of large movements that did schism and then had trouble.
    • I don't think it's illegitimate to bring suggestive examples rather than a systematic review of movement trajectories, but it should be admitted that cherry-picking three examples isn't hard.
  • There's no effort expended to establish equivalence between EA and its goals and Christianity, Islam, or Atheism at the gears level of what they're trying to do. I could argue that they're pretty different.
  • I seriously do not expect that an EA schism would result in bloodshed for centuries. Instead, it might save thousands of hours spent debating online.
  • The argument that "EA is too important" proves too much. I could just as easily say that because the stakes are so high, we can't afford to have a movement containing people with harmful beliefs, and therefore it's crucial that we schism and focus fresh with people who have True Spirit of EA or whatever. 

This is not something I fault this post for not arguing about, but I'm personally inclined to think that "longtermist" EA should not have tried to become a mass movement (which is what the examples described are), and instead should have stayed relatively small and grown extremely slowly. I suspect many people are starting to wonder whether that's true, and if so, people who want a smaller, more focused, weirder, "extreme" group of people collaborating should withdraw from the people who aspire to a welcoming, broadly palatable mass movement, and each group will get out of each other's way.

There are historical reasons why things developed the way they did, but I think it is clear there are some distinct cultural/worldview clusters in EA that have different models and values and aren't united by enough to overcome that. I think that splitting might allow both groups to continue, rather than what would likely happen otherwise: one group just dissolving, or both groups dissolving except for a core of people who want to argue indefinitely.

What would convince me against splitting is if, no, really, everyone here is united very strongly by some underlying core values and world beliefs, and we can make enough progress on the differences en masse. I'm skeptical, but it's good to say what might convince you.


I consider myself an active EA, but you provide very little factual information in your introduction, and almost zero anchoring as to which proto-schisms you are referring to. Some candidates seem to be: longtermism vs. confidently/obviously good charity operations; s-risks vs. x-risks; or hierarchical vs. decentralized ops. I have no clue which of these, if any, you find painful or risky for the continuation of the community.
