Writing this under a fresh account because I don't want my views on this to impact my career opportunities.
--
TLDR: We're all aware that EA has been rocked by a series of high-profile scandals recently. I believe EA is more susceptible to these kinds of scandals than most movements because EA fundamentally has a very high tolerance for deeply weird people. This tolerance leads to more acceptance of socially unacceptable behavior than would otherwise be permitted.
--
It seems uncontroversial and obviously true to me that EA is deeply fucking weird. It's easy to forget once you're inside the community, but even the basic instinct of "do some math to see how much good our charitable dollars do" is unusual for most regular people. Extending that into "donate your money to save African people from diseases" strikes them as stranger still. Extending further into other 'mainstream EA' cause areas (like AI safety) ups the weird factor by several orders of magnitude. The work that many EAs do seems fundamentally bizarre to much, if not most, of the world.
Ideas that most of the world would find patently insane - that we should care about shrimp welfare, insect welfare, trillions of future em-style beings - are regularly discussed, taken seriously, and given funding and institutional weight in EA. Wildly unusual social practices like polyamory are common, and other unusual practices like atheism and veganism are outright the default. Anyone who's spent any amount of time in EA can probably tell you about some very odd people they've met: whether it's the guy who only wears those shoes with individual toes, or the girl who does taxidermy for fun and wants to talk to you about it for the next several hours, or the guy who doesn't believe in showers. I don't have hard numbers, but I'm sure the EA community over-indexes like mad for people on the autism spectrum.
This movement might have one of the highest 'weirdness tolerance' factors of any extant movement today.
--
This has real consequences, good and bad. Many of you have probably jumped to one of the good parts: if you want to generate new ideas, you need weirdos. There are benefits to taking in misfits and people with idiosyncratic ideas and bizarre behaviors, because sometimes those are the people with startlingly valuable new insights. This is broadly true. There are a lot of people doing objectively weird things in EA who are good, smart, kind, interesting and valuable thinkers, and who are having a positive impact on the world. I've met and admire many of them. If EA is empowering these folks to flex their weirdness for good, then I'm glad.
But there are downsides as well. If there's a big dial where one end is 'Be Intolerant of Odd People' and the other is 'Be Tolerant of Odd People', and you crank it all the way to 100% tolerance, you're going to end up with more than just the helpful, kind weirdos. You're going to end up with creeps and unhelpful, poisonous weirdos as well. You're going to end up with the people who casually invite coworkers to sex parties to experiment with BDSM toys. You're going to end up with people who say that "pedophilic relationships between very young women and older men are a good way to transfer knowledge", and also with people whose first instinct is to defend such a statement as a "high-decoupling cognitive style". People whose reaction to accusations of misconduct is to build a probability model and try to set an 'acceptableness threshold'. You know what should worry EA? I was not the least bit surprised to see so many accusations of wildly inappropriate workplace behavior, or so many semantic games defending abhorrent ideas and people. I thought, 'Yeah, seems like the EA crowd.'
Without going through every alleged incident, EA needs to acknowledge that it is inherently vulnerable to this kind of thing. Scott Alexander once wrote that if you create a community whose founding principle is 'no witch hunts', you're going to end up with a few committed idealists and ten thousand witches. To at least some extent, EA is seeing that play out now. Shitty people will abuse your tendency to accept odd behaviors and beliefs. They'll use your tolerance to take advantage of other people and behave inappropriately. If tolerated, they'll often graduate to more serious forms of assault or fraud. They've already been doing it. And EA is going to keep having embarrassing incidents that damage the movement until it gets this under control.
--
I think there are concrete changes the community should make in order to be less susceptible to this sort of terrible behavior.
- Be marginally less accepting of weirdness overall.
- Broadly speaking, EA already has a massive surplus of people generating weird new ideas, strange new cause areas, or just bizarre stuff in general. EA faces a much larger challenge in addressing existing areas competently and professionally. On the margin, EA would benefit from basically just growing up: becoming less of a counter-cultural social scene and more of a boring, professional environment. EA still has extremely large gaps in basic cause areas, and it needs to scale boring competency more than it needs to scale weirdness at this stage of the movement.
- Related: Be less universal in assumptions of good faith.
- Assuming good faith is a very good rule of thumb for the community to have. It's a good starting point. But making it a universal rule is dangerous, because people can and will abuse it. An example: I have directly, personally observed white nationalists talking about infiltrating rationalist spaces because they know they can abuse assumptions of good faith and use the 'debate it out' culture to their advantage. Be more willing to call out inappropriate, weird, and/or off-putting behavior, and more willing to simply shut down certain types of people without needing to endlessly discuss or justify it. Be more willing to call obvious red flags what they are.
- Be much, much less accepting of any intersection between romance and the office/professional network.
- EA seems to have a massive problem with people's romantic/sex lives intersecting with their professional lives. This is not normal, it's not healthy, and it shouldn't be widely accepted. Virtually every major company, university, or large organization has strict fraternization rules because they recognize that relationships + careers are a ticking time bomb. Executives at major institutions and multi-billion dollar companies are often fired in disgrace for having unethical office relationships that wouldn't even warrant a mention in EA circles.
- It shouldn't be acceptable to casually invite coworkers into your polycule. It shouldn't be acceptable to casually invite coworkers to a sex party. A company's executives sleeping together should be a major red flag, not a fun quirk. There should never have to be questions raised about whether a funder and a grantee are romantically linked. This is basic stuff out of normal society that EA seems to struggle heavily with. EA's tolerance of this sort of thing is a key reason EA is now in the midst of a sexual harassment scandal.
--
I agree that a low-weirdness EA would have fewer weird scandals, but I'm not sure whether these would just be replaced by more normal scandals. It probably depends a lot on exactly what changes you make. A surprisingly large fraction of the "normal" communities I've observed are perpetually riven by political infighting, personal conflicts, allegations of bad behavior, etc., to a far greater degree than is true for EA.
Choosing the right target depends on understanding what EA is doing right in addition to understanding what it's doing wrong, and protecting and cultivating the former at the same time we combat the latter.
I'm skeptical that optimizing against marginal weirdness is a good way to reduce rates of sexual misconduct, mostly for two reasons:
In the long run, I think the best way to make the EA community a healthy place is to optimize somewhat for weirdness as a secondary consideration, but mostly just optimize for a community that's honest, high-integrity, brave, compassionate, smart, skillful, thoughtful, self-aware, etc. Like, try to make EA actually virtuous, and actively push against incentives to merely seem virtuous in various regards.
I also think that optimizing against weirdness would be really bad for EAs' ability to make the world a better place, because I think EA is currently mostly bottlenecked on "vastly insufficient quantities of weird ideas, uncorrelated research directions, low-confidence exploratory attempts to try new things, etc."
The ITN framework also points in this direction: "importance, tractability, and neglectedness" is not that far off from "importance, tractability, and weirdness". You can ditch the weirder and more counter-intuitive ideas, but then you'll be competing with all the other people in the world who want to do important and tractable things that aren't socially risky or weird.
A surplus relative to what? I think that AI alignment is desperately in need of many weird new ideas, and also in serious need of a lot more honest, substantive public debate of weird and counter-intuitive strategy and governance issues.
This might just be an object-level disagreement about where EA's main positive impact is likely to come from, on our respective models of the world. E.g., if you think EA mainly has a positive impact via increasing donations to GiveDirectly, then I buy that EA's current idea pipeline might be a lot weirder than optimal for that.
--
Third generation Bay Area here - and, if you aren't going to college at Berkeley or swirling in the small cliques of SF among the 800,000 people living there, yeah, not a lot of polycules. I remember when Occupy oozed its way through here, leaving a residue of 'say-anything polyamorists' who were excited to share their 'pick-up artist' techniques when only other men were present. "Gurus abuse naïve hopefuls for sex" has been a recurring theme of the Bay every few decades, but the locals don't buy it.