I am worried.

The last month or so has been very emotional for a lot of people in the community, culminating in the Slate Star Codex controversy of the past two weeks. On one side, we've had multiple posts talking about the risks of an incipient new Cultural Revolution; on the other, we've had someone accuse a widely-admired writer associated with the movement of abetting some pretty abhorrent worldviews. At least one prominent member of an EA org I know, someone I deeply respect, deleted their Forum account this week. I expect there are more I don't know about.

Both groups feel like they and their sacred values are under attack. Both groups are increasingly commenting anonymously or from throwaway accounts, and seeing their comments mass-downvoted and attacked. It's hard not to believe we're at risk of moving in a much more unpleasant direction.

I'm not going to pretend I don't have my own sympathies here. I've definitely been feeling a lot more tribal than usual lately, and it's impaired my judgement at a couple of points. But I think it's important to remember that we are all EAs here. We're here because we endorse, in one form or another, radical goodwill towards the rest of the world. I have never been among a group of people at once more dedicated to the wellbeing of others and the pursuit of the true. I admire you all so much.

Many people here feel their membership in EA is a natural outgrowth of their other beliefs. Those other beliefs can differ quite a lot from person to person. But I implore all of you to see the common good in each other. There are many people in EA who hold beliefs and political opinions significantly different from mine. But with very few exceptions they have proven among the most open, honest and charitable proponents of those views I've ever encountered. We can have the conversations we need to have to get through this.

The Forum is probably not the place to have those conversations. Too many people are worried that their words will be used against them to speak openly under their own names – an indictment of our broader culture if ever there was one. But you can reach out to each other! Schedule calls! This is a bad time to be unable to hold in-person conferences, but it's not impossible to make up the difference if we try.

(And on the Forum, please try to be charitable, even if your conversation partner is falling short of the standards you would set yourself. Strive to raise the tone of the conversation, not just to match it. I have sometimes failed in this recently.)

I'll start. If I say something on the Forum you disagree with, and you don't think it's productive to discuss it in comments, please feel free to reach out to me by private message, or schedule a call with me here.

Our epistemic norms are precious. So are our norms of compassion, justice, and universal goodwill. We need both to achieve the lofty goals we've set ourselves, and we need each other.

Comments

Re "Cultural Revolution" comparison, let me put it this way: I'm a naturalized citizen of the US who has lived here for 30+ years, and recently I've spent 20+ hours researching the political climate and immigration policies of other countries I could potentially move to. I've also refrained multiple times from making a public comment on a topic that I have an opinion on (including on this forum), because of potential consequences that I've come to fear may happen in a few years or decades later. (To be clear I do not mean beatings, imprisonment, or being killed, except as unlikely tail risks, but more along the lines of public humiliation, forced confessions/apologies, career termination, and collective punishment of my family and associates.)

If there are better or equally valid historical analogies for thinking about what is happening and what it may lead to, I'm happy to hear them out. But if some people are just offended by the comparison, I can only say that I totally understand where they're coming from.

I basically think the Cultural Revolution, witch hunts, and people being denounced as heretics are all equally good (and equally bad) comparisons. All three are examples of top-down, peer-enforced violence against an outgroup whose members can be accused without cause.

The main differences I see here are that this doesn't seem truly top-down (neither the Republican party nor the church seems fond of cancel culture) and that it has more to do with reputation/livelihood than physical harm. (I have more thoughts about why they're different, but I'm self-censoring to be more convincing and because people are mean to me on the EA Forum, e.g. when I suggest sexism exists in America.)

I suspect there are many other historical examples of people demonizing the outgroup as well.

witch hunts [...] top-down

The vast majority of witch hunts were not top-down, as far as I remember from my cursory reading on this topic. They were typically driven by mobs and bottom-up social activity, with the church or other higher institutions usually trying to avoid getting involved with them.

Thanks Habryka. In that case, I take it back - witch hunts are a better analogy than the cultural revolution.

EDIT: I also prefer any analogy that emphasizes continuity. I don't think people being "cancelled" this week face particularly different circumstances than Monica Lewinsky did; I dislike analogies that suggest there's been a sudden change in how American society behaves.

The witch hunts were sometimes endorsed/supported by the authorities, and other times not, just like the Red Guards:

Under Charlemagne, for example, Christians who practiced witchcraft were enslaved by the Church, while those who worshiped the Devil (Germanic gods) were killed outright.

By early 1967 Red Guard units were overthrowing existing party authorities in towns, cities, and entire provinces. These units soon began fighting among themselves, however, as various factions vied for power amidst each one’s claims that it was the true representative of Maoist thought. The Red Guards’ increasing factionalism and their total disruption of industrial production and of Chinese urban life caused the government in 1967–68 to urge the Red Guards to retire into the countryside. The Chinese military was called in to restore order throughout the country, and from this point the Red Guard movement gradually subsided.

I would say the most relevant difference between them is that witch hunts were more "organic" – they happened pretty much everywhere people believed in the possibility of witches (which was pretty much everywhere, period) – whereas the Cultural Revolution was driven and enabled entirely by an ideology indoctrinated through schools, universities, and mass-media propaganda.

On one side, we've had multiple posts talking about the risks of an incipient new Cultural Revolution; on the other, we've had someone accuse a widely-admired writer associated with the movement of abetting some pretty abhorrent worldviews.

I'm not sure what contrast you are trying to make here:

  • The first post argues that, while SJ cancellations are a problem, we should not fight back against them because it would be too expensive. The second post agrees that SJ cancellations are a problem that could become much worse, but argues we should try to do something about it.
  • The third post is an example of an attempted SJ cancellation, criticizing the community for being insufficiently zealous in condemning the outgroup. (It was downvoted into oblivion for being dishonest and nasty).

The first two are motivated by concern over the rise of bullying and its power to intimidate people out of communicating honestly about important issues, and they discuss what we should do in response. The third article is... an example of this bad behaviour?

For the symmetry argument you want to make, it seems like you would need a right-wing version of the third post – like an article condemning the community for not doing enough to distance itself from communists and for failing to constantly reiterate its support for the police. Then it would make sense to point out that, despite the conflict, both sides were earnestly motivated by a desire to make the world a better place and avoid bad outcomes, and we should all remember this and respect each other.

But to my knowledge, no such article exists, partly because there are very few right-wing EAs. Rather, the conflict is between the core EA movement of largely centre-left people who endorse traditional enlightenment values of debate, empiricism and universalism, vs the rise of extreme-left 'woke' culture, which frequently rejects such ideals. Accusing the moderate left of being crypto-fascists is one of the standard rhetorical moves the far-left uses against the centre-left, and one they are very vulnerable to.


Note that I removed the link to the attack article because I think it is probably a violation of implicit forum norms to promote content with more than 100 net downvotes. If it hadn't been linked in this article I would not have come across it, which is probably desirable from the perspective of the moderators and the community.


Edit: the OP was edited between when I opened the page and started writing this comment and when I hit publish; at the request of the author I have updated the quote to reflect his edits, though I think this makes the comment a little harder to understand.

This comment does a good job of summarising the "classical liberal" position on this conflict, but makes no effort to imagine or engage with the views of more moderate pro-SJ EAs (of whom there are plenty), who might object strongly to cultural-revolution comparisons or be wary of SSC given the current controversy.

As I already said in response to Buck's comment:

I agree that post was very bad (I left a long comment explaining part of why I strong-downvoted it). But I think there's a version of that post – phrased more moderately, and trying harder to be charitable to its opponents – that would get a lot more sympathy from the left of EA. (I expect I would still disagree with it quite strongly.)

As you say, there aren't many right-wing EAs. The key conflict I'm worried about is between centre/centre-left/libertarian-leaning EAs and left-wing/SJ-sympathetic EAs[1]. So suggesting I need to find a right-wing piece to make the comparison is missing the point.

(This comment also quotes an old version of my post, which has since been changed on the basis of feedback. I'm a bit confused about that, since some of the changes were made more than a day ago – I tried logging out and the updated version is still the one I see. Can you update your quote?)


    1. I also don't want conservative-leaning EAs to be driven from the movement, but that isn't the central thing I'm worried about here. ↩︎

Buck

What current controversy are you saying might make moderate pro-SJ EAs more wary of SSC?

Buck

Edit: the OP has removed the link I’m complaining about.

I think it's quite bad to link to that piece. The piece makes extremely aggressive accusations and presents very little evidence to back them up; it was extensively criticised in the comments. I think that piece isn't an example of people being legitimately concerned; it was an example of someone behaving extremely badly.

Another edit: I am 80% confident that the author of that piece is not actually a current member of the EA community, and I am more than 50% confident that the piece was written mostly with an intention of harming EA. This is a lot of why I think it's bad to link to it. I didn't say this in my initial comment, sorry.

I don't have strong views on this, but I'm curious why you think linking to instances of bad behavior is bad. All the reasons I can think of don't seem to apply here – the link clearly isn't an endorsement, and it doesn't provide resources (e.g. through increased ad revenue or page rank).

By contrast, I found the link to the post useful because it's evidence about community health and people's reactions: the fact that someone wrote that post updated me toward being more worried (though I think I'm still much less worried than the OP, and for somewhat different reasons). And I don't think I could have made the same update without skimming the actual post – simply reading a brief description like "someone made a post saying X in a way I think was bad" wouldn't have been as epistemically useful.

I would guess this upside applies to most readers. So I'm wondering which countervailing downsides would recommend a policy of not linking to such posts.

I have two complaints: linking to a post which I think was made in bad faith in an attempt to harm EA, and seeming to endorse it by using it as an example of a perspective that some EAs have.

I think you shouldn't update much on what EAs think based on that post, because I think it was probably written in an attempt to harm EA by starting flamewars.

EDIT: Also, I kind of think of that post as trying to start nasty rumors about someone; I think we should generally avoid signal boosting that type of thing.

Thanks for explaining. This all makes some sense to me, but I still favor linking on balance.

(I don't think this depends on what the post tells us about "what EAs think". Whether the author of the post is an EA accurately stating their views, or a non-EA trying to harm EA, or whatever - in any case the post seems relevant for assessing how worried we should be about the impacts of certain discussions / social dynamics / political climate on the EA community.)

I do agree that it seems bad to signal boost that post indiscriminately – e.g. I think it would be bad to share it without context on Facebook. But in a discussion of how worried we should be about certain social dynamics, I think it's sufficiently important to look at examples of these dynamics.

EDIT: I do agree that the OP could have done more to avoid any suggestion of endorsement. (I thought there was no implied endorsement anyway, but based on your stated reaction and on a closer second reading I think there is room to make this even clearer.) Or perhaps it would have been best to explicitly raise the issue of whether that post was written with the intent to cause harm, and what this might imply for how worried we should be. Still, linking in the right way seems clearly better to me than not linking at all.

I'm still pretty sceptical that the post in question was written with a conscious intention to cause harm. In any case, I know of at least a couple of other EAs who have good-faith worries in that direction, so at worst it's exacerbating a problem that was already there, not creating a new one.

(Also worth noting that at this point we're probably Streisanding this dispute into irrelevance anyway.)

I agree that post was very bad (I left a long comment explaining part of why I strong-downvoted it). But I think there's a version of that post – phrased more moderately, and trying harder to be charitable to its opponents – that would get a lot more sympathy from the left of EA. (I expect I would still disagree with it quite strongly.)

I think there's a reasonable policy one could advocate, something like "don't link to heavily-downvoted posts you disagree with, because doing so undermines the filtering function of the karma system". I'm not sure I agree with that in all cases; in this case, it would have been hard for me to write this post without referencing that one, I think the things I say here need saying, and I ran this post by several people I respect before publishing it.

I could probably be persuaded to change that part given some more voices/arguments in opposition, here or in private.

(It's also worth noting that I expect there are a number of people here who think comparisons of the current situation to the Cultural Revolution are quite bad, see e.g. here.)

Buck

I think that both the Cultural Revolution comparisons and the complaints about Cultural Revolution comparisons are way less bad than that post.

I agree that comparisons to the Cultural Revolution are bad. As someone with family members who were alive during the Chinese Cultural Revolution (one of whom died because of it), I'm pretty unsympathetic to people saying cancel culture is the new cultural revolution.

Buck

Many of the people making the comparisons are themselves personally connected to the Chinese Cultural Revolution, though. E.g. the EA who I see posting the most about this (who I don't think would want to be named here) is Chinese.

Yes, I've spoken in depth with one. I'm not saying he shouldn't be able to make the comparison, but we agreed the comparison has no predictive power and is one of many that could be made (e.g. you could probably just as easily compare the current situation to witch hunts, which are a more common analogy in Western circles).

We also agreed there are dissimilarities (e.g. in this situation in America there's no state backing of anyone being targeted; in fact, social justice protestors are much more likely to be injured or killed by the state than the people they oppose).

(I have now cut the link.)

Buck
culminating in the Slate Star Codex controversy of the past two weeks

I don't think that the SSC kerfuffle is that related to the events that have caused people to worry about cultural revolutions. In particular, most of the complaints about the NYT plan haven't been related to the particular opinions Scott has written about.

"Culminating" might be the wrong word, I agree the triggering event was fairly independent.

But I do think people's reactions to the SSC kerfuffle were coloured by their beliefs about the previous controversy (and Scott's political beliefs), and that it contributed to the general feeling I'm trying to describe here.

So far the comments here have overwhelmingly been (various forms of) litigating the controversy I discuss in the OP. I think this is basically fine – disagreements have all been civil – but insofar as there is still interest, I'd be keen to hear people's thoughts on a more meta level: what sorts of things could we do to help increase understanding and goodwill in the community over this issue?

Thanks for making this post, Will –

I'll admit that since the SSC stuff happened, I've been feeling a lot further from EA (not necessarily the core EA ideas, but associating with the community or labeling myself as an EA), and I felt genuinely a bit scared learning through the SSC stuff about ways in which the EA community overlaps with alt-right communities and ideas, etc. I don't know what to make of all of it, as everyone I work with regularly in EA is wonderful and cares deeply about making the world better. But I feel wary and nervous about all this, and I've also been considering leaving the forum / FB groups just to have some space to process what my relationship with EA ought to be, external to my work.

I see a ton of overlap between EA in concept and social justice. A lot of the dialogue in the social justice community focuses on people reflecting on their biases and working to shift out of a lens on the world that introduces certain kinds of bias. And, broadly, folks working on social justice issues are trying to make the world better. This all feels very aligned with EA approaches, even if the social justice community works on different issues and focuses on different kinds of biases.

I've heard (though I don't know much about it) that EA outreach organizations to some extent stopped focusing on growth a few years ago and shifted toward quality, in some sense. I wonder if doing that has locked in whatever norms were present in the community at that time, and whether that's unintentionally resulted in a fair amount of animosity toward ideas, or approaches to argument, that fall outside the community's standards of acceptability. I generally think that one of the best ways to improve this issue is to invest heavily in broadening the community, and part of that might require work to make the community more welcoming (and not actively threatening) to people who might not feel welcome here right now.

Thanks, Abraham. It's really valuable to get these perspectives, and it's helpful to get people discussing these issues under their real names where they feel they can. I agree that there is a lot of overlap between the impulses that lead people into EA and those that lead many people into SJ.

I'm too tired right now to respond to this in the depth and spirit it deserves – I'll try and do so tomorrow – so just wanted to flag that this is a positive and valuable contribution to the discussion. I hope any responses to it in the meantime are made in the same spirit.

This post does a much better job than I could manage of explaining how I've felt recently. Thank you for writing it.

Something I've been doing just a bit lately which seems to be working surprisingly well so far: If I see a polarizing discussion on EA Facebook, and someone writes a comment in a way which seems needlessly combative/confrontational to me, I add them as a friend and private message them trying to persuade them to rewrite their comment.

My general model here is that private 1-on-1 communication is much higher-bandwidth, less ego-driven, and more amenable to the resolution of misunderstandings. However, it's not nearly as scalable (in terms of the size of the audience reached) as a forum discussion. But private 1-on-1 communication where you try to persuade someone to change their forum writing gets you the best of both worlds.

Another model is that combativeness tends to beget combativeness, so it's high-leverage to try & change the tone of the conversation as early as possible.
