There are many articles I've wanted to write for a long time on how those in effective altruism can help each other do more good and make effective altruism even better than it is now. Yet a single barrier stops me: the culture of fear in effective altruism.

Criticism has become so distorted from what it should be that my intention would not even be to criticize. Yet there is no way to suggest any organization could be doing anything better without someone interpreting it as an attempt at sabotage. It's not that I'm afraid of how others will respond. It's that so many individual actors have come to fear each other and the community itself. Weathering the barrage of hostility that comes from trying to contribute to anything is too much of a hassle to be worthwhile.

A couple of months ago, an article was published on the EA Forum in which two researchers disclosed their experience of being subjected to widespread peer pressure not to publish a paper critical of some tendencies in longtermism. That didn't shock me. What shocked me was that the leadership and grantmakers in this movement, with the biggest portfolios at their command, were themselves shocked that it happened.

A couple of weeks ago, I spent a couple of hours with a friend who, like me, has participated in this movement for a decade, puzzling over how to resolve the culture of fear. We made almost no progress, but I posed him a scenario:

What if one person expressed to someone working at an EA-affiliated organization that:
1. 60% of what the organization is doing could not be done any better; 

2. 20% of what they're doing could be tweaked a bit to be better, 

3. and they don't disagree with, but are curious about, the rationale for the remaining 20% of what the organization does?

Here's how my friend responded:

Well, you might be able to get away with that, but it would be hard.

That wasn't about one cause in effective altruism. It was about all of it. The only progress I made was learning how many others feel the same way I do. It may not be helpful for me to raise all this without specific details or constructive solutions. Yet making others aware, especially the core leadership of this movement, that the culture of fear is worse than they would ever have anticipated is a first step.

Comments

Criticism has become so distorted from what it should be that my intention would not even be to criticize. Yet there is no way to suggest any organization could be doing anything better without someone interpreting it as an attempt at sabotage. It's not that I'm afraid of how others will respond. It's that so many individual actors have come to fear each other and the community itself. Weathering the barrage of hostility that comes from trying to contribute to anything is too much of a hassle to be worthwhile.

I notice that the OP has gotten twenty upvotes (including one from me), but I myself have never encountered the phenomenon described. My experience, like D0TheMath's, is that people who offer criticism are taken seriously. Other people in this comment section, at least so far, seem to have had similar experiences.

Could some of the people who've experienced such chilling effects give more details about them? By PM, if they don't anticipate, as strongly as I do, that the responses on the open forum will be civil and gracious?

Mau

Thanks for this!

I found some of this surprising. I've heard staff and leadership of a few orgs in the community (especially newer ones) say things like, "maybe this project will fail because I'm incompetent," or rattle off a list of ways their org could be doing much better. I've also seen a group of people present substantial criticism to at least one org and receive responses that were at least respectful (and in that case were followed by substantial reforms). Some public, substantive criticism (e.g. [1], [2]) has also seemed to be well received by the community. So I'm having trouble seeing where this is coming from. Maybe we've seen different parts of the picture, and there's more of a culture of fear in parts of the community that I haven't seen?

(The examples I have in mind are mostly of people being receptive to criticism that is privately expressed, is focused on tactics/strategy, and/or doesn't single out specific people or orgs. I expect people tend to be significantly more defensive in response to criticism that is public, personal or organization-specific, or values-focused. It's not clear to me that defensiveness toward just the latter is (very) harmful, though; such criticism can be more destructive and harder to get right, so maybe it's good for it to need to meet a higher bar.)

If people are worried about retaliation, I also don't immediately see why anonymous criticism isn't a good alternative.

Well, you might be able to get away with that, but it would be hard.

Lastly, I think it might be helpful to disambiguate this. "It would be hard" could mean anything from "people aren't very likely to change organization policies in response to criticism" to "people are likely to retaliate by withholding professional opportunities," and the latter could be a much worse state of affairs.

I'd like to see some data on how prevalent the fear that criticizing major EA orgs will lead to negative outcomes is in EA, as well as why people hold it. These anecdotal accounts mean we should include such questions on more surveys, but I'm skeptical of updating too much in favor of one particular hypothesis about the cause based on what could be just a few cases of miscommunication/irrational fears/anti-centralized-org priors/anxiety/other-things-I-can't-think-of.

From what I've experienced, the opposite is largely true: EAs mostly reward people who offer good criticisms with respect and status. Though, granted, I have little first-hand experience interacting with large EA orgs, and even less giving good criticisms of them.

There are many articles I've wanted to write for a long time on how those in effective altruism can help each other do more good and make effective altruism even better than it is now. Yet a single barrier stops me: the culture of fear in effective altruism.

I suggest writing the articles anyway. I predict that, unless your arguments are bad, the articles (supposing you write 5 articles) will each have >= 0 karma a week after publication, and I am willing to bet $10 at 1:10 odds that this is the case. We can agree on a neutral party to judge the quality of the articles if you'd like, and we can adjust the odds and the karma threshold if you like.

Coming back to this almost a year later, having thought about it a few times: that post gets across most of what I wanted to express in this post better than I could have, so thanks for sharing it!

This post received a mixed reception when I initially published it a year ago, though I guess I really called it. If it seems like I'm kind of boasting here, you're right. I've been vindicated, and while I feel tempted to say "I told you so," I'll note instead that I was more right than I was given credit for. I say so in solidarity with those who spoke up before me, as I alluded to in this post, and with those who've spoken up since, in the wake of the FTX crisis and after.

I'm not sure how common it is for people to experience hostility to criticism within the EA movement. I think apathy is a much more common response, one that I've experienced several times, and it is no less damaging. If you don't share the perspective of the EA organization you are criticizing, then you bear an enormous burden of proof to get them to even pay attention. Whereas someone with the same perspective can write a few sentences including words like "epistemic status: confident" or "likely" and it will be absorbed with enthusiasm. It's easy to give up; how many times would you try walking into a church and persuading the priest to convert to another religion?

EA organizations aren't unique in this flaw; there is a general human propensity to engage in motivated reasoning. It is elevated in the EA community because everyone is trying to work on "the most important thing." That is identity-forming, so criticism strikes a personal chord, questioning value systems and the capacity to reason.

Being able to openly, honestly, and enthusiastically accept criticism is a rare skill. It requires being as dedicated to being unsure as to being right or "less wrong." But being unsure doesn't inspire action, so those who end up in leadership roles tend to be more obstinate than their waffling counterparts. I think organizations could correct for this by employing a jester or red team with equal status and decision-making power to organizational leadership, but I've never seen this done well.

I often think of it as EA being too conservative rather than having a culture of fear, and maybe those are different things, but here's some of what I see happening.

People reason that EA orgs and people representing EA need to be respectable because this will later enable doing more good. And I'd be totally fine with that if every instance of it were clearly instrumental to doing the most good.

However, I think this goal of being respectable doesn't take long to become fixed in place, and now people are optimizing for doing the most good AND being respectable, which means they will trade off doing the most good against respectability along the efficiency frontier. Ditto for other things people might optimize for: being right, growing the movement, gaining power, etc.

To the extent that EA is about doing the most good, we should be very clear when we start trying to optimize for other things. Yes, if we optimize for doing the most good in the short term we'll likely harm ourselves in the long term, but so too does the movement harm itself by trading away doing the most good for other things that someone thinks might matter, without a solid case that it's the right trade to make. You could argue that someone like Will MacAskill put a lot of thought into being respectable and had good reason to do so, rather than just immediately doing the short-term thing that would have done the most good for EA but would have been weird and bad for the movement long term. But today I don't think most people are doing this sort of calculation; they're instead just saying "ah, I think in EA we should be respectable or whatever" and then optimizing for that AND doing the most good, thus probably failing to get the most good. 😞
