Effective altruism is based on the core belief that all people count equally. We unequivocally condemn Nick Bostrom’s recklessly flawed and reprehensible words. We reject this unacceptable racist language, and the callous discussion of ideas that can and have harmed Black people. It is fundamentally inconsistent with our mission of building an inclusive and welcoming community.
— The Centre for Effective Altruism
Here's a VERY uncharitable idea (that I hope will not be removed, because it could be true, and if so it might be useful for EAs to think about):
Others have pointed to the rationalist transplant versus EA native divide. I can't help but feel that this is a big part of the issue we're seeing here.
I would guess that the average "EA native" is motivated primarily by their desire to do good. They might have strong emotions regarding human happiness and suffering, which might bias them against a letter using prima facie hurtful language. They are also probably a high decoupler and value stuff like epistemic integrity - after all, EA breaks from intuitive morality a lot - but their first impulses are to consider consequences and goodness.
I would guess that the average "rationalist transplant" is motivated primarily by their love of epistemic integrity and the like. They might have a bias in favor of violating social norms, which might bias them in favor of a letter using hurtful language. They probably also value social welfare (they wouldn't be here if they didn't), but their first impulses favor finding a norm-breaking truth. It may even be a somewhat deontological impulse: it's good to challenge social norms in search of truth, independent of whether it creates good consequences.
The EA-native impulse seems more helpful to the EA cause than the rationalist impulse.
And I worry the rationalist impulse may even be actively harmful if it dilutes EA's core values. For example, in this post a rationalist transplant describes themself as motivated by status instead of morality. This seems very bad to me.
Again, I recognize that this is a VERY uncharitable view. I hasten to add that there are probably a great many rationalist transplants whose commitment to advancing social welfare is equal to or greater than mine, as an EA native. My argument is about group averages, not individual characteristics.
...
Okay, yes, I found that last sentence really enjoyable to write. Guilty as charged.