Writing this under a fresh account because I don't want my views on this to impact my career opportunities.
--
TLDR: We're all aware that EA has been rocked by a series of high profile scandals recently. I believe EA is more susceptible to these kinds of scandals than most movements because EA fundamentally has a very high tolerance for deeply weird people. This tolerance leads to more acceptance of socially unacceptable behavior than would otherwise be permitted.
--
It seems uncontroversial and obviously true to me that EA is deeply fucking weird. It's easy to forget once you're inside the community, but even the basics like "Do some math to see how much good our charitable dollars do" is an unusual instinct for most regular people. Extending that into "Donate your money to save African people from diseases" is very weird for most regular people. Extending further into other 'mainstream EA' cause areas (like AI safety) ups the weird factor by several orders of magnitude. The work that many EAs do seems fundamentally bizarre to much/most of the world.
Ideas that most of the world would find patently insane - that we should care about shrimp welfare, insect welfare, trillions of future em-style beings - are regularly discussed, taken seriously, and given funding and institutional weight in EA. Wildly unusual social practices like polyamory are common, and other unusual practices like atheism and veganism are outright the default. Anyone who's spent any amount of time in EA can probably tell you about some very odd people they've met: whether it's a guy who only wears those shoes with individual toes, or the girl who does taxidermy for fun and wants to talk to you about it for the next several hours, or the guy who doesn't believe in showers. I don't have hard numbers, but I am sure the EA community over-indexes like mad for those on the autism spectrum.
This movement might have one of the highest 'weirdness tolerance' factors of any extant movement today.
--
This has real consequences, good and bad. Many of you have probably jumped to one of the good parts: if you want to generate new ideas, you need weirdos. There are benefits to taking in misfits and people with idiosyncratic ideas and bizarre behaviors, because sometimes those are the people with startlingly valuable new insights. This is broadly true. There are a lot of people doing objectively weird things in EA who are good, smart, kind, interesting and valuable thinkers, and who are having a positive impact on the world. I've met and admire many of them. If EA is empowering these folks to flex their weirdness for good, then I'm glad.
But there are downsides as well. If there's a big dial where one end is 'Be Intolerant Of Odd People' and the other end is 'Be Tolerant Of Odd People' and you crank it all the way to 100% tolerance, you're going to end up with more than just the helpful, kind weirdos. You're going to end up with creeps and unhelpful, poisonous weirdos as well. You're going to end up with the people who casually invite coworkers to go to sex parties with them to experiment with BDSM toys. You're going to end up with people who say that "pedophilic relationships between very young women and older men are a good way to transfer knowledge" and also people whose first instinct is to defend such a statement as a "high decoupling cognitive style". People whose reaction to accusations of misconduct is to build a probability model and try to set an 'acceptableness threshold'. You know what should worry EA? I was not the least bit surprised to see so many accusations of wildly inappropriate workplace behavior or semantic games defending abhorrent ideas/people. I thought 'yeah, seems like the EA crowd'.
Without going through every alleged incident, EA needs to acknowledge that it is inherently vulnerable to this kind of thing. Scott Alexander wrote once that if you create a community whose founding principle is 'no witch hunts', you're going to end up with a few committed idealists and ten thousand witches. To at least some extent, EA is seeing that play out now. Shitty people will abuse your tendency to accept odd behaviors and beliefs. They'll use your tolerance to take advantage of other people and behave inappropriately. If tolerated, they'll often graduate to more serious forms of assault or fraud. They've already been doing it. And EA is going to keep having embarrassing incidents that damage the movement until they get this under control.
--
I think there are concrete changes the community should make in order to be less susceptible to this sort of terrible behavior.
- Be marginally less accepting of weirdness overall.
- Broadly speaking, EA already has a massive surplus of people generating weird new ideas, strange new cause areas, or just bizarre stuff in general. EA has a much larger challenge in addressing existing areas competently and professionally. On the margin, EA would benefit from basically just growing up: becoming less of a counter-cultural social scene and more of a boring, professional environment. EA still has extremely large gaps in basic cause areas, and EA needs to scale boring competency more than it needs to scale weirdness at this stage of the movement.
- Related: Be less universal in assumptions of good faith.
- Assuming good faith is a very good rule of thumb for the community to have. It's a good starting point. But having it as a universal rule is dangerous, because people can and will abuse it. An example: I have directly, personally observed white nationalists talking about infiltrating rationalist spaces because they know they can abuse assumptions of good faith and use the 'debate it out' culture to their advantage. Be more willing to call out inappropriate, weird and/or off-putting behavior, and more willing to simply shut down certain types of people without needing to endlessly discuss or justify it. Be more willing to call obvious red flags as red flags.
- Be much, much less accepting of any intersection between romance and the office/professional network.
- EA seems to have a massive problem with people's romantic/sex lives intersecting with their professional lives. This is not normal, it's not healthy, and it shouldn't be widely accepted. Virtually every major company, university, or large organization has strict fraternization rules because they recognize that relationships + careers are a ticking time bomb. Executives at major institutions and multi-billion dollar companies are often fired in disgrace for having unethical office relationships that wouldn't even warrant a mention in EA circles.
- It shouldn't be acceptable to casually invite coworkers into your polycule. It shouldn't be acceptable to casually invite coworkers to a sex party. A company's executives sleeping together should be a major red flag, not a fun quirk. There should never have to be questions raised about whether a funder and a grantee are romantically linked. This is basic stuff out of normal society that EA seems to struggle heavily with. EA's tolerance of this sort of thing is a key reason EA is now in the midst of a sexual harassment scandal.
--
Strong upvote.
Three additional arguments in favor of (marginally!!!!) greater social norm enforcement:
(1)
A movement can only optimize for one thing at a time. EA should be optimizing for doing the most good.
That means sometimes, EA will need to acquiesce to social norms against behaviors that - even if fine in isolation - pose too great a risk of damaging EA's reputation and through it, EA's ability to do the most good.
This is trivially true; I think people just disagree about where the line should be drawn. But I'm honestly not sure we're drawing any lines right now, which seems suboptimal.
(2)
Punishing norm violations can be more efficient than litigating every issue in full (this is in part why humans evolved punishment norms in the first place).
And sometimes, enforcing social norms may not just be more efficient; it may be more likely to reach a good outcome. For example, when the benefits of a norm are diffuse across many people and gradual, but the costs are concentrated and immediate, a collective action problem arises: the beneficiaries have little incentive to litigate the issue, while those hurt have a large incentive. Note how this interacts with point (1): reputational damage to EA at large is highly diffuse.
To strengthen this point, social norms often pass down knowledge that benefits adherents without their ever realizing it. Humans aren't good at getting the best outcomes from our individual reasoning; we're good at collective learning.
(3)
There are a lot more people in the world interested in norm violation than in doing the most good. Therefore, we should expect that a movement too tolerant of weirdness will create too high a ratio of norm-violators to helpful EAs (this is the witch hunt point made in the OP).