After some recent discussion on the forum and on Twitter about negative experiences that women have had in EA community spaces, I wanted to start a discussion about concrete actions that could be taken to make EA spaces safer, more comfortable, and more inclusive for women. The community health team describes some of their work related to interpersonal harm here, but I expect there's a lot more that the wider community can do to prevent sexual harassment and abusive behavior, particularly when it comes to setting up norms that proactively prevent problems rather than just dealing with them afterwards.

Some prompts for discussion:
- What negative experiences have you had, and what do you wish the EA community had done differently in response to them?
- What specific behaviors have you seen that you wish were less common, or that you wish there were stronger norms against? What would have helped you push back against them?
- As the movement becomes larger and more professionalized, how can we enable people to set clear boundaries and deal with conflicts of interest in workplaces and grantmaking?
- How can we set clearer norms related to informal power structures (e.g. people who are respected or well-connected within EA, community organizers, etc.)?
- What codes of conduct should we have around events like EA Global? Here's the current code; are there things that should be included but currently aren't (e.g. explicitly talking about not asking people out in work-related 1:1s)?
- What are the best ways to get feedback to the right people on an ongoing basis? E.g. what sort of reporting mechanisms would make sure that concerning patterns in specific EA groups get noticed early? And which ones are currently in place?
- How can we enable people who are best at creating safe, welcoming environments to share that knowledge? Are there specific posts which should be written about best practices and lessons learned (e.g. additions to the community health resources here)?
I'd welcome people's thoughts and experiences, whether detailed discussions or just off-the-cuff comments. I'm particularly excited about suggestions for translating these ideas into concrete actions going forward.
EDIT: here's a Google Form for people who want to comment anonymously; the answers should be visible here. And feel free to reach out to me via messages or in person if you have suggestions for how to do this better.
This will be my last message in this thread, because I find this conversation upsetting every time it happens (and every time it becomes clear that nothing will change). I find it really distressing that a bunch of lovely and caring people can come together and create a community that can be so unfriendly to the victims of assault and harassment.
And I find it upsetting that these lovely and caring people can fall into what is, from my perspective on morality, a serious moral failure. (I say this while also accepting that it reflects not evilness but rather a disagreement about morality, such that the lovely, caring people really do continue to be lovely and caring; they simply disagree with me about a substantive question.)
To reply to your specific comments, I certainly agree that there is room for nuance: situations can be unclear and there can be clashes of cultural norms. Navigating the moral world is difficult and we certainly need to pay attention to nuances to navigate it well.
Yet as far as I'm concerned, it remains the case that someone's contributions via their work are irrelevant to assessing how we should respond to their serious wrongdoing. It's possible to accept the existence of nuance without thinking that all nuances matter. I do not think that this nuance matters.
(I'm happy to stick to discussing serious cases of wrongdoing and simply set aside the more marginal cases. It would represent such a huge step forward if EA could come to robustly act on serious wrongdoing that I don't want to get distracted trying to figure out the appropriate reaction to the less crucial cases.)
I cannot provide an argument for this of the form that Oliver would like. His comment suggests he might prefer an argument that is ultimately consequentialist in nature, even if at some layers removed, but I think that is fundamentally the wrong approach.
Everyone accepts some moral claims as fundamental. I take it as a fundamental moral claim that when a perpetrator commits a serious wrong against someone, it is the nature of the wrong (and perhaps the views of the person wronged, per Jenny's comment) that determines the appropriate response. I don't expect that everyone reading this comment will agree, and I don't believe it's always possible to argue someone into a moral view. At some fundamental level, we end up having to accept irreconcilable disagreements, as much as that frustrates the EA urge to use reason to settle all matters.
(At this point, we could push into hypothetical scenarios like, "what if you were literally certain that if we reacted appropriately to the wrongdoing then everyone would be tortured forever?". Would the consequences still be irrelevant? Perhaps not, but the fact of the matter is that we do not live in a hypothetical world. I will say this much: I think that the nature of the wrongdoing is the vastly dominating factor in determining how to respond to that wrongdoing. In realistic cases, it is powerful enough that we don't need to reflect on the other considerations that carry less weight in this context.)
I've said I don't expect to convince the consequentialists reading this to accept my view. What's the point, then? Perhaps I simply hope to make clear just how crucial an issue of moral conscience this is for some people. And perhaps I hope that this might at least push EA to consider a compromise that is more responsive to this matter of conscience.