I'm one of the contact people for the effective altruism community. I work at CEA as a community liaison, trying to support the EA community in addressing problems and staying healthy and welcoming.
Please feel free to contact me at julia.wise@centreforeffectivealtruism.org.
Besides effective altruism, I'm interested in folk dance and trying to keep up with my three children.
Glad to see more attention on this area!
A little spot-checking:
"People with nothing more than a high-school diploma and a month long crash course can treat PTSD ~75% as well as a professional therapist." The metastudy linked doesn't attempt to compare lay counselors with professional therapists; it's only about trained lay counselors.
Thank you for writing about an important subject! I’m sorry about the ways I gather EA has been difficult for you. I’ve found EA pretty emotionally difficult myself at times.
People who fill out the EA Survey are likely to report that EA has a neutral or positive effect on their mental health. This might be because participating in a community and having a sense of purpose can be helpful for people's wellbeing. Of course, you’d expect bias here because people who find EA damaging may be especially likely to leave the community and not take the survey. An excerpt from a colleague’s summary:
“An interesting bit of information is that the 2022 EA survey asked how EA had affected the mental health of individuals in the community. While some people reported that their mental health had reduced as a result of being part of EA, on average, most people reported improved mental health. Obviously, there is some sampling bias here in who filled out the survey. Still, this was more positive than I expected. That’s not to say that we can’t do better - it would be really great if no one was in a situation where they found that this was personally harmful for them.
. . . I asked Rethink Priorities to do a more thorough analysis of this question. They’ve now done this! TL;DR: There are only small differences in responses across cause area/engagement level/location/career level/time in EA (students + newcomers were slightly more likely to say EA improved their mental health than other groups).”
Source: EA Survey 2022
About existing efforts on mental health in EA (some of which are mentioned in other comments):
I’ll note that I think it’s good to have mental health resources tailored to specific communities / populations, but this doesn’t necessarily mean much about the prevalence of problems in those populations. E.g. there are lots of therapy resources aimed at people with climate anxiety, and therapists who specialize in treating medical professionals, clergy, etc.
Cross-posting Georgia Ray's / @eukaryote's "I got dysentery so you don't have to," a fascinating read on participating in a human challenge trial.
Glad this question-and-answer happened!
A meta note: sometimes people post questions aimed at an organization but don't flag them to the actual org. I think it's a good practice to flag questions to the org; otherwise you risk:
- someone not at the org answering the question, often with information that's incorrect or out of date
- the org never seeing the question and looking out of touch for not answering
- comms staff at the org feeling they need to comb public spaces for questions and comments about them, lest they look like they're ignoring people
(This doesn't mean you can't ask questions in public places, but do email the org a link to them!)
(Writing personally, not organizationally)
I'm happy people are trying experiments like this!
Thinking about other ways that people incorporate each other's judgement about where to donate: often it involves knowing the specific people.
I think some people who knew each other through early EA / GWWC did this — some had a comparative advantage in finance so went into earning to give, and others had a comparative advantage in research or founding organizations so went into nonprofits. But they made heavy use of each other's advice, because they knew each other's strengths.
It's also common to do this within a couple / family. My husband spent 10 years earning to give while I worked in social work and nonprofits, so he's earned the large majority of what we've donated. Early on, the two of us made separate decisions about where to donate our own earnings (though very informed by talking with each other). Later we moved to making a shared decision on where we'd donate our shared pot of money. This isn't necessarily the best system — people are biased toward trusting their family even in domains where the person isn't very competent, and you can see examples like the Buffett family where family members seem to make kind of random decisions.
I feel good about people pooling judgement when they know the strengths and weaknesses of the specific other people involved. I feel much less excited about pooling judgement with people whose judgement I know nothing about.
I found this a clear explanation of the costs and benefits - thanks for writing it up!
A similar issue: lack of salt iodization in Europe, the region where children have the highest rates of iodine deficiency. https://www.who.int/publications/i/item/9789241593960
There’s an asymmetry between people/orgs that are more willing to publicly write impressions and things they’ve heard, and people/orgs that don’t do much of that. You could call the continuum “transparent and communicative, vs locked down and secretive” or “recklessly repeating rumors and speculation, vs professional” depending on your views!
When I see public comments about the inner workings of an organization by people who don’t work there, I often also hear other people who know more about the org privately say “That’s not true.” But they have other things to do with their workday than write a correction to a comment on the Forum or LessWrong, get it checked by their org’s communications staff, and then follow whatever discussion comes from it.
A downside is that if an organization isn’t prioritizing back-and-forth with the community, of course there will be more mystery and more speculation that is inaccurate but goes uncorrected. That’s frustrating, but it’s a standard way that many organizations operate, both in EA and in other spaces.
There are some good reasons to be slower and more coordinated about communications. For example, I remember a time when an org was criticized, and a board member commented defending the org. But the board member was factually wrong about at least one claim, and the org then needed to walk back wrong information. It would have been clearer and less embarrassing for everyone if they’d all waited a day or two to get on the same page and write a response with the correct facts. This process is worth doing for some important discussions, but few organizations will prioritize doing this every time someone is wrong on the internet.
So what’s a reader to do?
When you see a claim that an org is doing some shady-sounding thing, made by someone who doesn’t work at that org, remember the asymmetry. These situations will look identical to most readers: