Against opposing SJ activism/cancellations

by ChichikoBendeliani · 1 min read · 18th Jun 2020 · 27 comments


Movement Strategy · EA Messaging · Community · Frontpage

ETA: I am NOT saying we're currently living in a cultural revolution, or that the current trajectory of things will lead to a cultural revolution with probability 1. However, I place greater than 1% but less than 10% chance that the US will end up in something not too far away from the cultural revolution in our generation, and that this is correlated with the rest of the Anglophone world (~70%). To the extent that wrongful cancellations matter to our movement at all, I think almost all of the harm will be in the tails.

In response to posts like this one.

Strong opinion, loosely held:

I think it’d be bad for large groups of EAs or rationalists to wade in on social justice issues, particularly defending “problematic” people who might be cancelled or decrying dangers of guilt by association.

Suppose you’re an abolitionist in the late 17th century. Your number one rallying cry is slavery delenda est. Every waking moment, you champion the end of this hideous blot on humanity. You have a small number of allies and some sympathizers, but a lot of respectable people are embarrassed to be around you, and you’ve certainly made some powerful enemies (though none are near you).

Recently, you hear something about witches and witch-hunters, where a small number of people are accused of being witches, and worse, even “witch-sympathizers” are censured and denigrated (though no witch-sympathizer has been burned...yet). A distant acquaintance of an acquaintance of yours is accused of being a witch. You’re pretty sure witches aren’t real, and besides, your acquaintance-of-an-acquaintance is a perfectly fine person. Should you go to town hall in support of her? If so, should you rally your fellow abolitionists to also help defend purported witches?

I think it’s pretty obvious that in many situations you shouldn’t do this, since the risk of damage to your movement (and honestly, likely the personal risk) is probably not worth the extremely marginal decrease in the probability of the alleged witch being burned. I feel more strongly about group actions than I do about individual actions.

A friend of mine has parents who lived through the cultural revolution. At least one grandparent made a minor political misplay (his supervisor wanted him to cover up embezzling resources, he refused) and had his entire family history (including minor land ownership in an ancestor) dragged out of him. He was demoted, berated for years, had trash thrown at him etc. This seemed unfortunate, and likely limited his altruistic impact.

As experiences with the Cultural Revolution go, this was also likely one of the lighter ones. Other people were not nearly so lucky.

As a general strategy, it seems much better for most people in the community to watch what they say in public somewhat, be careful with their public associations, and minimize public contact with any associations that could be seen as potentially problematic.

Individuals can do so as part of a power play to the right/anti-SJ left, or because of their own convictions/spare time interests/personal friendships and loyalties, but doing so as a group is a dangerous correlated risk to the movement.

This is before getting into substantive critiques of whether the person in question is wrongfully accused. If witches are real, straying away from your mission is even less cost-effective.


ETA: If people do not wish to disagree with me publicly, happy to copy and paste comments from others if you PM me here. You can also ask forum moderators to relay comments.


27 comments

I disagree. It seems to me that the EA community's strength, goodness, and power lie almost entirely in our ability to reason well (so as to actually be "effective", rather than merely tribal/random). It lies in our ability to trust in the integrity of one another's speech and reasoning, and to talk together to figure out what's true.

Finding the real leverage points in the world is probably worth orders of magnitude in our impact. Our ability to think honestly and speak accurately and openly with each other seems to me to be a key part of how we access those "orders of magnitude of impact."

In contrast, our ability to have more money/followers/etc. (via not ending up on the wrong side of a cultural revolution, etc.) seems to me to be worth... something, in expectation, but not as much as our ability to think and speak together is worth.

(There's a lot to work out here, in terms of trying to either do the estimates in EV terms, or trying to work out the decision theory / virtue ethics of the matter. I would love to try to discuss in detail, back and forth, and see if we can work this out. I do not think this should be super obvious in either direction from the get go, although at this point my opinion is pretty strongly in the direction I am naming. Please do discuss if you're up for it.)

First of all, thanks so much for taking the time to provide an insightful (and poetic!) comment.

It seems to me that the EA community's strength, goodness, and power lie almost entirely in our ability to reason well

Mostly agreed. I think "reasoning well" hides a lot of details though, e.g. a lot of the time people reason poorly because of specific incentives rather than because of a general inability to reason.

Finding the real leverage points in the world is probably worth orders of magnitude in our impact.

Agreed

Our ability to think honestly and speak accurately and openly with each other seems to me to be a key part of how we access those "orders of magnitude of impact."

Agreed, but I think the more relevant question is whether the expected harm of being up against the wall in a cultural revolution is likely to hinder our accuracy more than the expected accuracy loss of some selective self-censorship, particularly in public.

I do find The Weapon of Openness moderately persuasive as a counterargument, as well as the empirical results of the pro- and anti- censorship questions raised around covid.

In contrast, our ability to have more money/followers/etc. (via not ending up on the wrong side of a cultural revolution, etc.) seems to me to be worth... something, in expectation

I think you're being really cavalier about being on the wrong side of the cultural revolution. Maybe the revolution will be light, or it won't happen at all, but if a cultural revolution half as big as China's happens and we're seen to be on the wrong side of it, I think the movement de facto cannot exist in the Anglophone world.

I also think you're maybe modeling this as me proposing that we as a community strongly side with the winning side here and try to acquire power and influence that way, which I emphatically am not. Instead I'm mostly proposing that most of us treat the possibility of a cultural revolution like the weather, and don't fight hurricanes until we understand geo-engineering much better.

I'm leaving open the possibility that a small number of us should try to be on either side of this, whether in a "vote your conscience" way or because they individually think they want resources or whatever, but by default I think our movement is best protected by not trying to acquire lots of political power or to fight revolutions.

I would love to try to discuss in detail, back and forth, and see if we can work this out.

I will try my best to talk about this more, but I can't promise I'll respond. I'm both pretty busy with work and (this is closer to my true rejection) find talking about these concepts kinda emotionally exhausting.

I received this as a private message:

Hi, this is meant to be a reply to your reply to Anna. Please post it for me. [...]
Agreed that Anna seems to be misinterpreting you or not addressing your main point. The biggest question in my mind is whether EA will be on the wrong side of the revolution anyway, because we're an ideological competitor and a bundle of resources that can be expropriated. Even if that's the case though, maybe we still have to play the odds and just hope to fly under the radar somehow.
Seems like hiring some history professors as consultants might be a good use of money for EA orgs at this point. It would be really helpful to have answers to questions like: Did any society ever manage to stop a cultural revolution after it has progressed to a stage analogous to the current one, and if so how (aside from letting it exhaust itself)? From historical precedent can we predict whether EA will be targeted? Were there relatively small groups that managed to survive these revolutions with their people/culture/property/relationships intact and if so how?

FWIW, I don't think a cultural revolution is very likely, just likely enough (>1%) that we shouldn't only think about object-level considerations when deciding whether to sign a petition or speak out publicly in support of someone.

I also suspect history professors will not be able to answer this honestly and dispassionately in worlds where a cultural revolution is likely.

I don't think the above reply is supposed to be pasted twice?

This post seems doomed to low karma regardless of its quality. You'll get downvotes from people who support aggressive SJ activism, people who think it's very bad and we should fight it, and people who think talking about this at all in public is unwise.

Not that that low karma necessarily means it shouldn't have been written. I fall somewhere between the second group and the third, but I didn't downvote this. I don't fully agree with the argument laid out here (if I did, I think I'd probably think the post shouldn't have been published), but I'm moderately glad the post exists.

I really don't like this about the voting system. My read is that you (Chichiko) provided some points on one side of an uncomfortable discussion. Most readers seem to overall agree with the other side. My impression is that they used their downvotes to voice their high level opinion, rather than because they found your specific points to be bad.

I feel quite strange about this, but it seems we're in some kind of meta-level argument about censorship: any points in favor of occasional censorship quickly get censored. By downvoting this piece so much, that's kind of what's happening.

I think, with our limited capacity for social consensus, and our high-IQ bias towards being contrarian, having the norm of bashing (not censoring per se) pro-censorship ideas is beneficial.

There are 34 votes on this post, so at least I'm comforted slightly by the nonzero number of people who think it's not terrible.

Someone wrote this to me privately. I agree with the substance of the criticism and have since edited the post accordingly.

> As a general strategy, it seems much better for most people in the community to [...] quickly disavow any associations that could be seen as potentially problematic.
This part seems objectionable to me even if I agreed with the rest of your post.
1. Public disavowal can increase the chance that the accused person will suffer unjust bad outcomes. This starts to slide away from ‘don’t protect your peers from being burned as witches’ to ‘preemptively help burn your peers as witches so you don’t get burned yourself’. This seems like bad game theory to me, so I question its wisdom from a selfish perspective. And it seems especially bad altruistically, if the people you’re helping burn are fellow EAs.
2. If you don’t think the accused person is actually a witch, then helping burn or condemn them also seems like a violation of some of the important ethical principles that help people coordinate.
If I expect my peers to lie or stab me in the back as soon as this seems useful to them, then I’ll be a lot less willing and able to work with them. This can lead to a bad feedback loop, where EAs distrust each other more and more as they become more willing to betray each other.
Highly knowledgeable and principled people will tend to be more attracted to groups that show honesty, courage, and integrity. There are a lot of contracts and cooperative arrangements that are possible between people who have different goals, but some level of trust. Losing that baseline level of trust can be extremely costly and cause mutually beneficial trades to be replaced by exploitative or mutually destructive dynamics.
Camaraderie gets things done. If you can create a group where people expect to have each other’s back, and expect to be defended if someone lies about them, then I think that makes the group much more attractive to belong to, and helps with important things like internal cooperation.
But even absent camaraderie, basic norms of civil discourse get an awful lot done too. Norms like ‘we won’t help make things worse for you and spread misinformation about you if someone is unethically targeting you’ get you a lot, even if you lose the valuable norm ‘we’ll defend you if someone is unethically targeting you’.
3. Another problem with denouncing people who you don’t think deserve denunciation is that it puts you on the record about any person, group, or idea anyone ever wants to make you publicly weigh in on. If you refused to participate in the witch-hunting as a matter of principle, then this might lose you some reputational capital in the near term, but in the long term it would make it harder for people to infer ‘oh, so you do endorse this other thing’ from your decision not to disavow something later.
One way of thinking about the free speech meme, 'though I disagree with what someone says, I'll defend to the death their right to say it', is that it’s functioning as exactly this kind of game-theoretic strategy right now. On this way of doing things, people get to avoid condemning each other as witches in academia - in fact, they even get to actively work to help and protect each other, in cases where they think the accusations are unjust, harmful, or false - all without ever endorsing or disavowing any of the actual positions under discussion.
To the extent this works, it works because a large group of people has agreed to an explicit strategy of protecting people even when they disagree with or dislike them. This lets you protect the falsely accused (or at least avoid accusing them of witchcraft yourself) without going on public record about every accusation that’s currently blowing up on social media.
This strategy doesn’t make them immune to cancellation, but especially so long as the strategy is widespread, it provides massively more leeway and protection against discourse evolving toward Red Scare dynamics.

The revised statement is:

"As a general strategy, it seems much better for most people in the community to watch what they say in public somewhat, be careful with their public associations, and minimize public contact with any associations that could be seen as potentially problematic."

More broadly, I think the thing I'm most worried about is altruistic nerds not thinking about the second order considerations at all, rather than any object level suggestions.

We needn't take on reputational risk unnecessarily, but if it is possible for EAs to coordinate to stop a Cultural Revolution, that would seem to be a Cause X candidate. Toby Ord describes a great-power war as an existential risk factor, as it would hurt our odds on AI, nuclear war, and climate change all at once. I think losing free expression would also qualify as an existential risk factor.

I'm extremely skeptical of EAs' ability to coordinate to stop a Cultural Revolution. "Politics is the mind killer." Better to treat it like the weather and focus on the things that actually matter, that we have a chance of affecting, and that our movement has a comparative advantage in (figuring out things about physical reality and plugging holes in places left dangerously unguarded).


It also doesn't seem that important in the grand scheme of things, relative to the much more direct existential risks.

I am also highly uncertain of EAs' ability to intervene in cultural change, but I do want us to take a hard look at it and discuss it. It may be a cause that is tractable early on, but hopeless if ignored.

You may not think Hsu's case "actually matters", but how many turns of the wheel is it before it is someone else?

Peter Singer has taken enough controversial stances to be "cancelled" from any direction. I want the next Singer(s) to still feel free to try to figure out what really matters, and what we should do.

This post describes related concerns, and helpfully links to previous discussions in Appendix 1.

I'm glad Singer has survived through stuff (and indeed, arguably his willingness to say true and controversial things is part of his appeal). For what it's worth, there's historical precedent for selective self-censorship of true views from our predecessors, cf. Bentham's unpublished essay on homosexuality:

discussed the essay in the light of 18th-century legal opinion and quoted Bentham's manuscript notes that reveal his anxieties about expressing his views

The decline of Mohism seems like a good cautionary tale of a movement that tries to both a) get political and b) not be aware of political considerations.

I agree that if it were possible to stop it, we should, but the EA movement is only a few thousand people. Even if we devoted all our resources to this issue, I doubt EA has enough influence over broad political trends to make much difference.

First they came for the Communists
And I did not speak out
Because I was not a Communist

Then they came for the Socialists
And I did not speak out
Because I was not a Socialist

Then they came for the trade unionists
And I did not speak out
Because I was not a trade unionist

Then they came for the Jews
And I did not speak out
Because I was not a Jew

Then they came for me
And there was no one left
To speak out for me

I think it's important to consider the general principles in question even if the particular instrumental claim is true: 'defending accused witches doesn't do as much good as you would, in expectation, be prevented from doing via your work on slavery if you defended accused witches.'

This seems to imply some general principles which don't seem that attractive, i.e. "Don't speak out against/defend against/protest one injustice if you think it will get in the way of working on injustices you care about more.'

This seems like the kind of violation of commonsense morality in the name of utilitarian instrumental goals that the EA community generally warns against. (I also worry that this specific violation of normal moral obligations like 'defend the innocent' 'speak the truth', makes it more likely that people will generally violate such norms in pursuit of their utilitarian goals).

This stance also seems quite shaky, since it seems like we would not generally support such reasoning if the cases were changed just a little bit, e.g.:

"We should not speak out against slavery, because it would get in the way of our important anti-poverty work."

"We should not defend or associate with controversial _racial justice activists_, because it will reduce our other EA work."

This also seems bad from a reciprocity standpoint i.e. if slavery activists don't defend or associate with witch defenders, then witch defenders, by the same token may not defend or associate with slavery activists (and so on for other controversial groups). These reciprocity considerations might apply either directly and instrumentally or indirectly via defending the general norm.

Your position also seems even more extreme than how I described it above at points, i.e. "it seems much better for most people in the community to watch what they say in public somewhat, _be careful with their public associations_, and _minimize public contact with any associations that could be seen as potentially problematic_." This goes beyond merely not publicly defending groups. Add "minimiz[ing] public contact" with the groups I gave as examples above and this position seems even more problematic.

That said I think one part of your somewhat concessive, but somewhat ambiguous final paragraph is potentially true:

Individuals can do so... but doing so as a group is a dangerous correlated risk to the movement.

I think it's good to grant that individuals can stand up for accused individuals. I still think that a statement warning off EAs "as a group" is potentially problematic, because this could mean "It's OK for a small number of EAs to do this but not too many", which seems as objectionable as "It's OK for a small number of EAs to publicly oppose slavery, but not too many." But if "as a group" meant "the EA community shouldn't make official public statements as a whole on the political debates of the day or on other controversial issues, and nor should official EA orgs" (which I don't think was your intended meaning), then I would agree with this principle.

I thought it worth pointing out that this statement from one of your comments I mostly agree with, while I strongly disagree with your main post. If this was the essence of your message, maybe it requires clarification:

"Politics is the mind killer." Better to treat it like the weather and focus on the things that actually matter and we have a chance of affecting, and that our movement has a comparative advantage in.

To be clear, I think justice does actually matter, and any movement that would look past it to “more important” considerations scares me a little, but I strongly agree with the “weather” and “comparative advantage” parts of your statement. We should practice patience and humility. By patience I mean not jumping into the hot topic conversation of the day, no matter how heated the debate. Humility means recognizing how much effort we spend learning about animal advocacy, malaria, X risk factors, etc. That is why we can feel confident to speak/act on them. But this doesn’t automatically transfer to other issues. Merely recognizing how difficult it is to get altruism right, compared to how much ineffective altruism there is, should be a warning signal when we wade out of our domains of expertise.

I think the middle ground here is not to allow people to bully you out of speaking, but to only speak when you have something worth saying that you considered carefully (preferably with some input from peers). So basically, as others have already mentioned: “what would Peter Singer do?”.

Someone said this to me recently:

I think the pace of cancellations and attempted cancellations has been picking up steam and has gotten pretty damn fast very recently. It is possible I am not sufficiently familiar with base rates, but it does seem like the current pace is quite high, so either it has accelerated or people on the peripheries of our community are unusually likely to be targeted. In either case this is cause for concern.
Eg, REDACTED1's firing, REDACTED2 asked to resign his VP position (but he still gets to be faculty), petition to remove REDACTED3 from his honorary titles.

Also lighter stuff like REDACTED4 forced to delete his Twitter account, and different spats within newsrooms, that I'm not following as closely.
In either case, I've updated towards being much more cautious than usual. There are already things I don't talk about, but I've updated to putting "nuanced takes on social justice topics" as on par with "criticizing the Chinese government as a Chinese citizen" for things I don't publicly talk about.
I talked to other people who are online a lot, and some current college students. The general consensus was substantial concern/alarm, which people who are older and/or less aggressively aligned may have missed.

Coordination infrastructure that would allow sane people to defuse runaway information cascades in generality would be very valuable to discover, unless physics doesn't allow it.

Can any historically and/or sociologically familiar people comment as to the ability for the witch hunts to have been stopped by counterfactually motivated and capable parties, and what order of magnitude of motivation and capability might have stopped it?

I'd be very excited to know of historically successful examples of this.

Perhaps the reason we don’t know of them offhand is that preventing big obscure potential harm never gets much status among our species.

I received this as a private message. It was unclear to me if it was initially intended as a comment, but I asked and they gave permission for me to do this:

I'm quite bothered by the implicit assumption that this is in fact a cultural revolution. I think the degree to which people will find supporting Hsu offensive is completely dwarfed by how offensive people will find offshoots of the Floyd protests as equivalent to the Cultural Revolution. As noted in https://www.scottaaronson.com/blog/?p=4859, there are steel man versions of the Hsu complaint in particular (though I don't personally have a view).
So if you're trying to preserve EA credibility, implicitly saying we're living through a Cultural Revolution could be extremely damaging.
Should be "is completely dwarfed by how offensive people will find the idea that offshoots of the Floyd protests are equivalent to the Cultural Revolution".
One can defend or not defend Hsu (again, I haven't looked at it enough to have a personal view) without making extreme claims about Cultural Revolutions, and both routes (defending or not defending) seem *way* better than crying Cultural Revolution.
If it isn't clear, I'm biased here, in that I personally find the Cultural Revolution claims offensive.

First of all, I sincerely apologize for any offense it may have caused. For what it's worth:

1. I obviously do not think we're living in a cultural revolution yet. The amount of harm caused by the 1960s cultural revolution to lives and livelihoods is severe, and the existing job losses and social disapproval caused by SJ actors is very minor in the grand scheme of things.

2. I do see how this comparison can be offensive. I'm not sure how else to disseminate my worries accurately without causing offense.

3. I personally think we're obviously not in the cultural revolution now, but that there's a moderately high probability that we are on that trajectory (over 1% seems intuitively defensible, my true probability is probably like 7%)

4. I think if we are on the trajectory of a cultural revolution, calling it early will definitely be wrong-think. It's unclear to me how to think about this. My guess is that if we're 10 years away, this weird claim will be swept under the rug relative to other offenses, and the positive benefit of open communication now + creating norms around minimally invasive self-censorship will be helpful. If on the other hand, cultural revolution-like activities are within the 1-5 year timescale, using such language will be too "on the nose" and this post itself is a structural risk...

5. One possibility is to move all conversations about this offline and try to approach EA leaders in person, which the pandemic certainly makes difficult.

6. One thing I did not mention in my post but that seems quite relevant is that, conditional upon a cultural-revolution-like event happening, it's hard to predict which side will launch it. This is also why I think we should not go out of our way to support SJ as a movement (though individuals can make their decisions based on their conscience or individual bids). The rise of various right-wing demagogues in the West also looks quite dangerous to me, and would be a structural risk to the movement as well, though my personal guess is that right-wing demagoguery tends to be parochial, so is less dangerous to a geographically mobile movement. Another thing driving my thoughts here is that we're too culturally left at our roots anyway (at least by American/Anglophone standards), so surviving a right-wing dictatorship is likely a lost cause.