This may be unhelpful… I don’t think it’s possible to get to 0 instances of harassment in any human community.
I think a healthy community will do lots of prevention and also have services in place for addressing the times that prevention fails, because it will. This is a painful change of perspective from when I hoped for a 0% harm utopia.
I think EA really may have a culture problem and that we’re too passive on these issues because it’s seen as “not tractable” to fix problems like gender/power imbalances and interpersonal violence. We should still work on some of those issues because a culture of prevention is safer.
But while it’s fine to aim for 0 instances of harm as an inspiring stretch goal, it may also be honest to notice when that goal isn’t possible to achieve.
Point of confusion/disagreement: I don’t think EA is big (15k globally?). I don’t think EA has domain-level experts in most fields to work with to find neglected solutions. EAs typically have (far) less than 15 years’ work experience in any field and, in my experience, they don’t have extensive professional networks outside of EA.
We have a lot more than we did ten years ago! And I agree ITN has flaws regardless, but I wanted to point out that if those are someone’s 2 main objections to using ITN today, it might not apply.
On the Forum? Or IRL?
In real life, I’ve selected to be around very compassionate people in EA and outside EA.
On the Forum… more men who “translate” experiences into ones that other men understand and don’t feel threatened by might help. I’ve noticed Will Bradshaw does this sometimes. Ozzie too. AGB sometimes.
Kirsten, Ivy, and Julia Wise do it often too. I know that for a lot of women, it’s really frustrating to be treated so skeptically when we raise personal experiences or views that vary from men’s experiences.
When I’m 1:1 with my hyper-rational or autistic male friends in person, we figure out how to understand each other compassionately, so I know these individuals don’t usually mean to be as callous as they sound online.
When I’m with my EA women friends, we talk about our personal experience and the broader social issues with tons of nuance and we appreciate that no one is shamed for not sounding Tribe-y enough (for any given tribe).
But on the Forum, it’s just so often super annoying to engage. I try to simply talk about my actual life sometimes, but I know I’m going to ping someone’s “woke” alarm and end up in a stupid thread of comments.
So I consider the Forum fine for a certain kind of information exchange but mostly a lost cause for mutual understanding of anything that hits interpersonal emotional chords. I only comment here about that sort of thing when I have a lot of downtime and extra emotional bandwidth.
[Edited to distinguish between “you” the individual and the general “you/us/people.”]
“People have a personal responsibility to tell others to stop what they're doing if they don't feel like they want others to do those things. Don't expect others to read your mind.”
Correction: “[I believe that] People have a personal responsibility to tell [me] to stop what [I’m] doing if they don't feel like they want [me] to do those things. Don't expect [me] to read your mind.”
You can totally take that stance. I personally even like that stance and have found it empowering at times!
But we know we can’t actually dictate a reality where everyone else takes that stance, right? Even if you might wish it were the case. Your version of Ought ≠ Is.
For a truth-seeking community, I get really frustrated when EAs miss this point. We interact with lots of people — people who believe what you believe, people who believe basically the opposite, and everyone in between. They may believe personal responsibility means modifying your behavior to avoid possible awkwardness, rather than expecting others to inform you if they found the behavior awkward.
Regardless of professed belief, it’s quite common to behave in a way that aligns with this reality: Giving negative feedback is often uncomfortable or even scary for a lot of humans. They are inconsistent in their willingness or incentives for overcoming that aversion. People who have or are perceived to have more influence/power/authority over others should recognize that it is socially riskier, more difficult, and more aversive for well-meaning others to give them negative feedback, and act accordingly.
If we want to live according to an accurate model of reality, I think we have to be willing to cope with this and try to “read others’ minds” (e.g., empathize) more.
One can maybe refuse to date people who see “personal responsibility” differently, but one can’t refuse to work with such people.
If someone refuses to accept that this variation exists in how “responsible behavior” is defined — across society, across workplaces, and even across different scenarios with the same individuals — and they’re unable or unwilling to sometimes modify their behavior in light of that variation, I hope I don’t have to work with them or refer people I know to work under them.
This comment seems willfully obtuse. The person is referring to a pattern of behavior, ergo a series of comments and bad experiences. A comment that comes at the end of a series and culminates in someone trying to take corrective action is not “a single comment that led to” their action.
Please reflect on how much you might be mad/sad/hurt/fearful and saying foolish things. Maybe don’t say them, or at least come back and fix them later.
I’m really happy to see you asking this question and doing an investigation of a charity and a cause yourself. It makes intuitive sense to me that moving from a very dangerous place to a very safe one would have long term benefits to well-being and seems worth doing additional investigation on the intervention.
It’s hard to know how much risk people are facing and how much improvement people will experience by moving; migration has upsides (e.g., better economic opportunity) and downsides (e.g., isolation from family). I’m not an expert on either, but I would be excited to see any additional analysis you can do on this and to see you post about it.
I think an ideal version of the EA Forum would have a lot of people willing to help you analyze this issue and work out answers for yourself. Please don’t get discouraged if you don’t get a lot of feedback here though. The EA Forum has happened to develop into a place with a high concentration of people primarily interested in AI, biosecurity, and animal welfare. There are definitely people around with other interests but you might have to hunt around for them.
You asked about translation. I feel tired trying to explain this and I know that’s not your fault! But it’s why I just don’t think the Forum works well for this topic.
My guess is that talking about “women’s issues” on the Forum feels about as taxing to me as it does for most AI safety researchers to respond to people whose reaction to AGI concerns is, “ugh, tech bros are at it again” or even a well-intentioned, “I bet being in Silicon Valley skews your perspective on this. How many non-SV people have the kinds of concerns you mention?”
Most of us are tired of that naïve conversation, especially with someone who thinks they have an informed take. Where do you even start?
Someone they trust has to say, “I’m like you and I’m concerned about this; it’s not just tech bro hype, and here’s why.” They have to translate across the inferential distance, ignorance, trust gap, and knowledge gap. They need to have the patience, time, and investment in bridging the gap.