
Foreword/Update

I wrote this post hurriedly because I wanted to express, before I forgot them, some thoughts related to other conversations in EA from the last couple of weeks. I was thinking while I was writing it that I might not get my point across clearly, and that suspicion was vindicated by a couple of comments on this post.

I've also been experimenting with ways of writing posts that are different from the conventional styles on the EA Forum. There are definitely ways I could have written this better. I'm getting the sense that the reactions I've received to a couple of posts like this now are converging on a mistake I'm making, one I'll try pivoting away from in the future.

Astral Codex Ten: The Streisand Effect That Keeps on Giving

For those who don't know, Scott Alexander is a writer and blogger at Astral Codex Ten who is popular among many in effective altruism. Seeing a rise in introspective criticism in EA (happening for reasons beyond the scope of this post), he wrote a post that was a criticism of criticism of criticism. As an example of a criticism of EA, from within EA, that he didn't like, Scott gave Remmelt's post 'Some blind spots in effective altruism and rationality.'

Scott clarified multiple times that he likes the people in EA who write criticisms of EA and thinks they're good people, so he doesn't mean anything personal by what he wrote. When asked by a commenter why he couched his own criticism of Remmelt's criticism in those terms, Scott replied:

I really hate people attacking me online. It makes me miserable. And their attacks tend to be ... false. Like if someone accuses me of being greedy, or writing something because of some specific sinister agenda, or something, I usually know why I wrote things and they're just wrong.

And this blog is read by ~50,000 people. If I say something mean about some normal person without a huge audience, this may be one of the worst things that ever happen to them, in the same way that the NYT saying mean things about me was one of the worst things that ever happened to me.

And all of these people are effective altruists trying to make the world a better place, who are additionally writing their honest criticisms of EA to make it better. I hate that "this person writes a well-intentioned article intended to improve the world" ---> "they get insulted and used as an example of badness on a blog read by 50,000 people and they're forever known as the person who got this wrong". I hate that I have to write posts like this at all.

Someone told me they thought Scott had felt a need to write that post to put an end to the epidemic of excessive self-criticism in EA. That's why I was thinking of not writing this post. I know others in EA who feel like the continuing debate over the value of criticizing whatever is a bunch of drama and a waste of time. They'd send me a link to a post called The Virtue of Silence from Scott's former blog, Slate Star Codex, to discourage me from stoking potential drama further. The point would be for me to not contribute to the Streisand effect. From Wikipedia:

The Streisand effect is a phenomenon that occurs when an attempt to hide, remove, or censor information has the unintended consequence of increasing awareness of that information, often via the Internet.

It turns out the goal of Scott's post wasn't to end the endless cycle of criticism in EA, or at least it didn't work, because he brought more attention to it yesterday with another post about that first post! The Virtue of Silence isn't relevant in this case, according to Scott himself, apparently, so here we are.

Getting back to how apologetic Scott felt he needed to be for drawing attention to a specific person like Remmelt: given that Scott thought he had to write a post like 'Criticism of Criticism of Criticism' at all, he could have at least saved the part criticizing Remmelt's post for a response post or comment on the Effective Altruism Forum.

If Scott knows from his experience with The New York Times how bad it feels to be on the receiving end of so many personal attacks online, then he could have spared Remmelt that unnecessary risk by not singling him out on a blog with ~50,000 readers!

There are more effective altruists who read Astral Codex Ten than read the EA Forum. There are more people who read Astral Codex Ten than there are effective altruists! If he had posted it somewhere with less of a reach, he would have had less to preemptively apologize for.

Alas, Scott brought it up to his 50,000 readers again anyway! It was okay, though, because it turned out Scott had overestimated that one risk for Remmelt. Remmelt was a good sport who handled himself well in the comments, including this one, worth mentioning because Scott himself highlighted it as one of the best comments on his 'Criticism of Criticism of Criticism' post. Unfortunately, the same kind of risk Scott was worried about applied to effective altruism as a community in general as well.

What Happens When You Play Fast and Loose with Narratives About Effective Altruism

Inadvertently, the way Scott presented EA in his post inspired some of the best takes on effective altruism of all time, real bangers, unimpeachable and impeccable insights, like these (emphasis added in all cases for parts that are hilarious and/or notable).
Exhibit A:

I don't know precisely what is the problem with EA, because I'm not putting in the epistemic work. But I feel *very* comfortable saying "I know it when I smell it; EA smells like it has good intentions plus toxic delusions and they're not really listening".

If you want the best definition I can personally come up with, it's this: EA peer pressures people into accepting repugnant conclusions. Given that, *of course* it doesn't want real criticism.

Exhibit B:

In the case of EA, I wouldn't believe it because EA has the hallmarks of a peer pressure organization, and I think the criticism they're most likely to discount is "the negative value of peer pressure outweighs the positive value of your work". That's not a fully general criticism of organizations; it's a specific and potentially useful criticism of one type of organization.

I wouldn't tell a shy conflict averse person not to work for the US Government. But I would tell them to avoid making contact with EA.

Exhibit C:

The idea of eliminating all suffering/evil in the world is dumb. Suffering is what makes us stronger, builds character, gives us something to fight against, (the hero vs the villain story). I'm not going to say we need more racists, misogynists, or chicken eaters but trying to eliminate all of them is a mistake. We've turned 'no racism' into a paper clip maximizer... and we should stop. 

Parts of these criticisms of EA are ludicrous enough to be funny, but there are parts of them (except the last one) that reiterate some common impressions of EA that are:

  1. if not inaccurate, imprecise;
  2. based on perceiving problems in EA as a community as inherent to EA as a philosophy;
  3. exactly the kind of misconceptions about EA that the community is constantly trying to dispel or address, on account of how strongly they might needlessly repel a lot of people who might otherwise participate in EA.

Maybe it's better that discourse like this happens on a public forum away from the EA community. It at least brings impressions of EA like these to the community's attention, so critical points can be acknowledged and wrong perceptions of EA dispelled. Many effective altruists responded decently to a lot of concerns about EA in the comments on Scott's post.

On the other hand, the media already creates enough of this kind of work for the EA community 24/7. For how important the reputation of EA is considered to be, constantly tending to concerns like this takes a lot of time. Nothing Scott said or did would have created extra work if he had published on the EA Forum. There, he would have been able to bring his post to the attention of the people he most wanted to notice what he had to say.

If there was a risk of someone identified in a post on Astral Codex Ten being personally attacked online, then there is a similar risk that a lot of people will end up thinking EA totally sucks for reasons Scott wouldn't himself endorse. It can't be emphasized enough that Scott has, this week, risked contributing to the very kind of problem he wanted to avoid by posting on his blog with 50,000 readers.

It's Signaling Most of the Way Up

One of the comments Scott highlighted as among the best on his post was from Zvi, whose 'Criticism of the Criticism Contest' was a partial inspiration(?)[1] for Scott's post. Zvi began with:

It is the dream of anyone who writes a post called Criticism of [a] Criticism Contest to then have a sort-of reply called Criticism of Criticism of Criticism.

The only question now is, do I raise to 4?

Zvi thankfully proceeds to not do that, but that's not enough. The ultimate solution is to do the opposite: ratchet the number of meta levels EA discourse is on as close to zero as possible. That's because going too meta sometimes causes everyone to become detached from, and forget the reality of, the object-level issues that they really, ostensibly, care about.

A lot of the now over 600 comments on Scott's original post were about:

  1. whether EA is an exclusively utilitarian philosophy.
  2. juxtaposing EA with religion, asking questions like, 'Is EA compatible with religion?' or 'How is EA similar to a religion?'

Meanwhile, Effective Altruism for Christians began as an outreach project years ago and is now incorporated as a non-profit with at least 500 members and 3 annual conferences under its belt. Effective Altruism for Jews is another outreach effort on a similar trajectory. Effective Altruism for Muslims posted an update on their progress yesterday after having begun a few months ago. There is a Facebook group called Buddhists in Effective Altruism with almost 400 members.

The Effective Altruism Global conference takes place in San Francisco this weekend. It's a great opportunity for maybe hundreds of effective altruists of different religious backgrounds to connect with peers who share their faith. Yet for all anyone knows, a similar number of religious readers of Astral Codex Ten who otherwise might have liked EA may now think there is no place for them in the community. It could be almost zero of Scott's religious readers, among those 50,000 readers overall, but the fact that outcome won't be tracked is part of the problem too.

What You Can Take Away from This Post

This is only one example of many of how so much meta-meta-discourse may inspire thousands of credulous people, both inside and outside of EA, to rampantly speculate, or to believe false or misleading information that hurts EA's reputation and capacity. Some of these aren't matters of debate where more research is always needed. Some of them are easily verifiable matters of fact.

The examples from Astral Codex Ten are about issues that are much lower stakes than the problems in EA so many don't want to contribute to making worse. To not hurt anyone's reputation or feelings, to not stoke drama, to avoid awkwardness: that's why so much critical or important discourse about EA in the community takes the form of vague social media or blog posts that never make it to the EA Forum. When misleading criticisms of EA never become legible to the community as a whole, stopping the spread of misinformation can become much harder.

If you're reading this thinking, "who, me?", as one of the effective altruists whose Twitter or Facebook posts may sometimes contribute to this problem, answering that question is left as an exercise for the reader.

  1. ^

    I might be confused or wrong about this, so someone please clarify this if you're in the know. 

Comments

I disagree with this pretty strongly, and have been worried about this type of view in particular quite a bit recently. It seems as though a standard media strategy of EAs is, if someone publishes a hit piece on us somewhere, whether obscure or prominent, just ignore it and "respond" by presenting EA ideas better to begin with elsewhere. This is a way of being positive rather than negative in interactions, and avoiding signal-boosting bad criticisms. I don't know how to explain how I have such a different impression, or why so many smart people seem to disagree with me, but this looks to me like an intuitively terrible, obvious mistake.

I don't know how to explain why it feels to me so clear that if someone is searching around, finding arguments that EA is a robot cult, or secretly run by evil billionaires, or some other harsh misleading critique, and nothing you find in favor of EA written for a mainstream audience even acknowledges these critics, and instead just presents some seemingly innocuous face of EA, the net takeaway will tend towards "EA is a sinister group all of these people have been trying to blow the whistle on". Basically all normal social movements have their harsh critics, and even if they don't always respond well to them, they almost all respond to them as publicly as possible.

The excuse that the criticisms are so bad that they don't deserve the signal (which to be clear isn't one this particular post is arguing) also leads me to think this norm encourages bad epistemics, and provides a fully general excuse. I tend to think that bad criticisms of something obscure like EA are generally quite easy for EAs to write persuasive debunking pieces on, so either a public criticism is probably bad enough that publicly responding is worth the signal boost you give the original piece, or it is good enough that it deserves the signal. Surely there is some portion of criticisms that are neither, that are hard to be persuasive against but are still bad, but we shouldn't orient the movement's entire media strategy around those. I wholeheartedly agree with this comment:

https://forum.effectivealtruism.org/posts/kageSSDLSMpuwkPKK/response-to-recent-criticisms-of-longtermism-1?commentId=WycArpwah9aveNrZs

If some EA ever had the opportunity to write a high-quality response like Avital's, or to be blunt almost any okay response, to the Torres piece in Aeon or Current Affairs, or for that matter in WSJ to their recent hit piece, I think it would be a really, really good idea to do so; the EA Forum is not a good enough media strategy. ACX is easy mode for this: Alexander himself is sympathetic to EA, so his main text isn't going to be a hit piece, the harsher points in the comments are ones people can respond to directly, and he will even directly signal-boost the best of these counter-criticisms, as he did. I am very scared for the EA movement if even this looks like a scary amount of daylight.

This is something I've become so concerned about that I've been strongly considering posting an edited trialogue I had with some other EAs on an EA chat, where we tried to get to the bottom of these disagreements (though I've been too busy recently), but I just wanted to use this comment as a brief opportunity to register this concern a bit in advance as well. If I am wrong, please convince me; I would be happy to be dissuaded of this, but it is a very strong intuition of mine that this strategy does not end well for either our community health or public perception.

Upvoted. Thanks for putting in the time for the thoughtful response. I wasn't sure whether the message I was trying to get across would land when I was writing this post, so your comment confirms that suspicion.

It seems as though a standard media strategy of EAs is, if someone publishes a hit piece on us somewhere, ignore it and "respond" by presenting EA ideas better to begin with elsewhere. This is a way of being positive rather than negative in interactions, and avoiding signal-boosting bad criticisms. I don't know how to explain how I have such a different impression, or why so many smart people seem to disagree with me, but this looks to me like an intuitively terrible, obvious mistake.

I agree with you that's the wrong kind of response to news media. I intended my post to be not about criticisms in news media and the misconceptions they might provoke, but about posts by those already in the community on their own social media.

I used Scott's post as an example because it's similar to the kind of post I'm talking about. I thought the point might not land properly because Scott's blog is so big that its effect may be more comparable to the impact of a newspaper than to the personal blog or social media feed of just any effective altruist. It turns out I was right that it wouldn't land, so it's my mistake that the point I made got muddled.

Anyway, to reiterate, I think:

  1. Individuals already in EA who write informal criticisms for the community itself should consider posting on the EA Forum more even if they feel it may not be appropriate.
  2. That's because running those criticisms through a central locus allows those of us in EA who are most able to verify correct or incorrect info about EA to do so, as the cost of false info about sensitive subjects spreading among the general public may outweigh the social risks, real or perceived, of posting on the EA Forum.

Some things your comment gets at that I should have been explicit about:

  • Centralizing more discourse to a single locus like the EA Forum is something I'm only suggesting for internal discourse. Community members having to constantly correct each other and the misconceptions we ourselves provoke is an unnecessary redundancy. Dealing with that more efficiently can free up time and attention to focus on external criticisms properly, like you suggest. Criticisms in mainstream/news media, or from outside EA entirely, shouldn't be dealt with that way.
  • Community members shouldn't be discouraged from sharing candid opinions of their own elsewhere online, but it'd be preferable if they were shared on the EA Forum too. A lot of valuable information that could become common knowledge is lost when it just sits on Twitter or Facebook.

Surely there is some portion of criticisms[...]that are hard to be persuasive against but are still bad, but we shouldn't orient the movement's entire media strategy around those.

With all the different kinds of criticism and the strategizing about how to respond to them, one of my points is that what gets lost are responses to criticisms that aren't bad but are based on false premises, like the premise that EA has in practice only been inclusive of utilitarianism, or that it may be incompatible with religion. Those are the easiest kinds of misconceptions to dispel, but there isn't much focus on them at all.

If some EA ever had the opportunity to write a high-quality response like Avital's, or to be blunt almost any okay response, to the Torres piece in Aeon or Current Affairs, or for that matter in WSJ to their recent hit piece, I think it would be a really, really good idea to do so; the EA Forum is not a good enough media strategy.

I agree with this. I've thought about it before, but I've felt skeptical that publications would be receptive. I'm not aware of many in EA who've tried anything like that, so it could be worth a shot to submit to, say, Aeon. It's better that such responses be posted on the EA Forum than nowhere else prominent online. For what it's worth, too, I've been thinking of writing more direct responses to mainstream criticism of EA, kind of like I started trying to do in this comment.

The one caveat I've got left is that, as to some of the 'hit' pieces, there are some of us in EA who are in the awkward position of not seeing them as hit pieces. They're perceived as somewhat fair coverage making some bad but also some good points about EA in need of addressing. That there are problems in EA in need of change is a different kind of reason for addressing external criticism on the EA Forum.

Thanks for the gracious response, and apologies for misunderstanding. I think I still disagree with parts of your post. I disagree with parts of his piece and think that Alexander could have done a better job getting background on Remmelt's piece before singling it out (and I also think it would have been a good idea for him to crosspost it to the forum, although he was uncomfortable with this), but I still think the piece was a net benefit as written, and didn't harm EA, or Remmelt himself, to any degree that we should be especially worried about. I do think interaction with the forum is generally beneficial, both for insiders and interested outsiders, but I'm not nearly so worried about the costs of publishing things about EA off the forum, and think many of the problems with doing this that exist in at least some cases are self-inflicted by current EA norms I would rather challenge instead.

Upvoted. No need to apologize because your criticism was valid on the basis of how I presented my case, which didn't leave my main arguments particularly clear. 

I still think the piece was a net benefit as written, and didn't harm EA, or Remmelt himself, to any degree that we should be especially worried about.

Yeah, I haven't read the comments on Scott's follow-up post yet because there were not many when I first noticed it. I'm guessing there are more comments now and some themes among the reactions that may serve as an indicator of the ways Scott's post has led to more or less accurate understandings of EA.

I expect its ultimate impact will be closer to neutral than significantly positive or negative. I'm guessing any downside would amount to only a few people being put off EA anyway. Public communication on this subject, 3+ meta levels in, might be intellectually interesting, but in practice it's too abstract to be high-stakes for EA.

Higher stakes, like the perception of a controversial topic in AI safety/alignment among social or professional networks adjacent to EA, might be a risk more worth considering for a post like this. Scott himself has for years handled controversies in AI alignment like this better than most in the community. I'm more concerned about those in the community who aren't as deft as Scott, and who aren't skillful enough to avoid the mistakes in public communication about EA that he is relatively competent at avoiding.

many of the problems with doing this that exist in at least some cases are self-inflicted by current EA norms I would rather challenge instead.

I don't have as strong a sense yet of what exactly the main causes of this are, but in general I get the same impression.

Interestingly, we might not disagree on very much after all. I probably did too much pattern matching between your writing and broader impressions I get about EA media strategies. Still, glad we got to chat it out!

Yeah, me too! Thanks for the conversation!

Is the argument here that nobody should criticize effective altruism on websites that are not EA forum, because then outsiders might get a negative impression? And if so, what kind of impression would outsiders get if they knew about this proposed rule?

Upvoted for asking important clarifying questions. To answer them, the argument is not that:

  1. External critics should post on the EA Forum. There are a lot of problems with that, which Devin Kalish covered well in this comment.
  2. Those already in the EA community should only post on the EA Forum. It'd be preferable that they also post on the EA Forum, for example by summarizing conversations from social media or posting links to their personal blog posts.

The argument is:

  1. Community members should consider posting more on the EA Forum, even if there are perceived risks to doing so.
  2. The rationale is that community members posting exclusively off the EA Forum can carry risks just as great, since inaccurate info about EA may spread in ways that are harder to keep track of and address.

I don't mean for this to be a hard rule. What I'd want is for the EA Forum to serve a different function as a portal where outsiders could notice that their concerns with EA are being addressed and, when legitimate, validated in a systematic way. I expect that effort would garner more appreciation than the mostly random and disorganized approach EA seems to take now. That was at least my thinking until this comment from Devin raised a lot of points about how such an endeavour may be too hard or not valuable enough, especially compared to alternatives that time and energy could be invested in.

Upvoted, especially for this:

What I'd want is for the EA Forum to serve a different function as a portal where outsiders could notice that their concerns with EA are being addressed

The forum is public and will get more and more exposure as the movement grows in numbers, funding and access to power. And this is a great thing - it allows anyone who supports us, is against us, or is affected by our actions to get a fully transparent impression of what we do and why.
