Update (3 May 2022): I may have underestimated the difficulty of identifying which opinions to present, of objectively collecting and aggregating data on EA opinions, and of presenting them in a way that does not lose nuance. I might also be overestimating the benefit of presenting such data. I might write another post on this after spending more time on it.

Disclaimer: I have not spent much time on this post. It may not be 100% coherent or provide the best model for thinking about this topic. This is a short post.

Mostly I just wished to initiate discussion on this topic.


Motivation for the post

This post was prompted by recent tweets from Dr Timnit Gebru criticising EA and longtermism specifically. [1] [2] [3]

The accusations run primarily along leftist lines of thought. EA and longtermism are claimed:

 - to prioritise the interests of the rich and powerful over those of common people

 - to be divorced from the lived experience of common people

 - to follow colonialist lines of thought

 - to instrumentalise common people in the pursuit of ends

 - to launder and rationalise biases that the rich and powerful already have, and actions they already wish to take

 - to be cultish / fanatical


Why are accusations of implicit bias problematic?

IMO accusations of implicit bias are very easy to make and very hard to argue against. I personally think Dr Gebru overestimates the amount of bias towards certain lines of thought, but this is hard to objectively prove. This difficulty in disproving bias is a significant part of politics in general. One side can claim that 20% of group A is into objectionable thing XYZ, the other side can claim it's actually more like 5%, and the only way to get to the truth is to read through every single piece of content published by every person in group A, then aggregate and analyse it for hidden motivations, things left unsaid, and so on. In practice almost no one does this, and both sides just keep shouting their claims louder and louder.


When I say "implicit bias" here I mean it in a very general sense: people in a group A being sympathetic to a stance XYZ that is not explicitly stated, but that most people in the group believe anyway. By this definition, implicit bias is not intrinsically bad; whether it is bad in any given circumstance is a separate question. There are other ways one can define bias, which I am not considering here.


What implicit biases does EA actually have?


Here are some I've observed:

 - The majority of EAs are sympathetic to the capitalist mode of production and to economically right-wing lines of thought. Sure, some are far more economically left, but they are in the minority and usually don't post much or represent the movement. The EA movement as a whole does not have a consensus that would enable it to push against the capitalist mode of production.

 - Many EAs are not in favour of many forms of systemic change that are espoused by people in other circles. EAs favour systemic change in some narrow areas, and might in future expand the set of changes they wish to bring about. But as of today, the EA movement's consensus is not in favour of major systemic change.

 - EA more often than not seeks to ally with the existing rich and powerful in society rather than attempt to replace them. Many people (non-EAs) have a gut reaction against this.

 - (insert more here?)


The above biases are the ones that will be most apparent to someone from a leftist community or movement encountering EA for the first time. Entirely different lists of biases might be apparent to other communities, be they first-world Christians, people in China or Russia, and so on.


Why should these biases be made explicit?


By explicit, I mostly mean they should be made more prominent: included on the front pages of EA websites, in the introductory sections of EA blogs and books, and so on, as opposed to being buried somewhere in the forum.


The main reason to make biases explicit is that a significant number of people who encounter EA for the first time are already involved in other social movements that aim to do good. They will tend to view EA through the lenses of the movements or communities they are already part of, whether you ask them to or not.


I feel like EA today often tries to sidestep political questions, remain diplomatic, and thereby hope for a multipolar consensus on the narrow areas in which it wishes to act, such as x-risk reduction and global health and poverty. This might be an insufficient approach, especially if other communities wish to escalate conflict anyway. Working towards mutually beneficial ends only works if both sides are willing not to escalate.


By explicitly analysing EA through the lenses of other social movements and popular stances, we reduce the burden on people entering from those movements of understanding EA, both its explicit statements and its implicit biases. IMO people often hunt for implicit bias when encountering new communities, whether you want them to or not.

Reducing this burden also makes it harder for people to misrepresent EA stances accidentally, or to take seriously deliberate (malicious) misinterpretations of EA published by others.


Reducing the burden matters because people usually allocate very little time to analysing claims seriously, hunting for implicit biases, and so on, especially when they casually encounter something novel like EA for the first time. It is a tall order to expect everyone to move past this and spend significant time engaging with the object-level claims EA makes before forming an opinion.



Do share your thoughts; I wished to initiate a discussion. Please let me know if I can constrain the discussion or make it more meaningful in any way.


18 comments

I wouldn’t call these biases. I was a typical lefty before I joined EA but when I learned about economic history and microeconomics I heavily updated towards what you call “capitalist mode of production”. In hindsight, I knew very little about economics and just repeated the same ideas that my leftist friends had. Knowing more has made me better at criticizing certain aspects of capitalism but also more appreciative of others.

The bias, in my opinion, should not be stated as the outcome ("believes in capitalism") but as what directs one to arrive at such a position (e.g. risk aversion wrt bad publicity, trust in authority, etc.).

You're right "bias" may not have been the best word for what I wanted to say, it's somewhat negatively-coded.

I'd be keen to know more what you mean by "what directs one to arrive at such a position".

A better word for this might be "priors" - it indicates that there's already existing inertia towards these positions in EA, but doesn't carry the connotation of this being a bad thing. 

That said, it's not 100% accurate: a "prior" is an assumption about the world that comes from deep-seated beliefs or upbringing, whereas many of us in EA have arrived at our existing stances via reasoning.

Hmm, maybe I can call it a prior, but from the view of the outsider. Say an outsider with far-left stances comes and talks to 5 EAs, and finds 1 person who shares their far-left opinions and 1 person who holds stances they find repulsive. Or maybe it's not 5 EAs in person they meet, but 5 articles or blog posts they read. They're now going to assume that roughly 20% of EAs are far-left and 20% have opinions they find repulsive, extrapolating this into their "prior".

(And sure the ideal thing to do is talk to more than 5 people but not everyone has time or interest for that, which is why I might want EAs to instead present this data to them explicitly.)
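The small-sample extrapolation above can be sketched as a quick simulation. (This is purely illustrative; the 5% "true" share, the sample size of 5, and the trial count are assumptions I've picked, not survey data.)

```python
import random

random.seed(0)

TRUE_FRACTION = 0.05   # assumed true share of EAs holding some stance
SAMPLE_SIZE = 5        # people (or posts) the outsider encounters
TRIALS = 100_000       # number of simulated outsiders

overestimates = 0
for _ in range(TRIALS):
    # Each encounter independently hits the stance with probability TRUE_FRACTION.
    hits = sum(random.random() < TRUE_FRACTION for _ in range(SAMPLE_SIZE))
    # Meeting even one such person yields an estimate of at least 1/5 = 20%.
    if hits / SAMPLE_SIZE >= 0.20:
        overestimates += 1

print(f"Outsider estimates >= 20% in {overestimates / TRIALS:.0%} of cases")
```

Even if only 5% of EAs actually hold the stance, roughly a fifth of outsiders (1 - 0.95^5, about 23%) will meet at least one such person in a sample of 5 and walk away with a prior of 20% or more, which is why surfacing aggregate data could beat leaving people to extrapolate from tiny samples.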

I think this idea has some merit in itself, but it would be a lot more complex in practice. Some other replies have covered that and I have nothing useful to add, so I won't. One thing I would say is that we have to acknowledge that some of the criticism you list is pretty genuine. We aren't a perfect community, and this does impact our activities. Some examples from what Timnit et al. discussed on Twitter:

 'EA and longtermism are claimed to be divorced from the lived experience of common people'

The majority of EA's base is in the world's 'elite' academic centres. This is for obvious reasons, in that these are the best bang-for-buck places to find reliable academic talent. However, it also means that EA generally picks from the most privileged in society. For example, a 2014 Cherwell report (a bit dated, but relevant) found that the national average of students having to work while they study to support themselves is 57%, 90% of whom work 20+ hours per week. At Oxford University, only 20% of students work, and 'the majority', according to the report, work under 5 hours per week. This is because the students there are generally from much wealthier backgrounds, so fewer are forced to find part-time employment to fund themselves. This may also be due to other factors, such as a lack of work nearby (e.g. Oxford isn't a huge city), but these are unlikely to be the main factor. Also, most EA orgs are based in the most expensive places in the UK (London, Oxford, Cambridge), which means that people with more financial runway can afford to take more opportunities.

Is this a bad thing? Not in itself. But it does mean that EA's membership demographic may not reflect the demographic makeup of society at large. It's a genuine criticism. To be fair, the EA community does try hard to correct this, but it's just looking at things as they are. When Timnit makes this point, she doesn't make it baselessly.

'EA and longtermism are claimed to follow colonialist lines of thought'

An example of this is that influential EA members have in the past published pieces that are pretty horrifying to the general public, and I can't for the life of me think why people are surprised at the backlash. People are like "omg we're being dragged publicly on Twitter", and you read the tweet sources and you're like "well... yeah".

 For example, posing the idea that some humans are more valuable than others. Hot takes like "Hey, maybe disabled people don't have the same right to be alive as non-disabled people because they're less useful" are going to generate some aggro. And in my opinion rightly so.  Everyone is entitled to freedom of speech, but the community has to accept the blowback from that policy.

I've often thought that EA's lack of centralised reaction to things like this has been a bit of an issue. Like when there were protests against EA regarding some views the public found objectionable, EA just kind of ignored it. But the media didn't.  Then again, creating some kind of EA PR dept could be seen as having something to hide, and EA has always liked a full transparency policy. 

You're right in that it would be useful to say up front "Hey, we're not perfect; here are the values we believe in, but not all members agree, and that's okay". But what are those values, and who decides? And would such an intro have a chilling effect on philosophical thought? Or would it help, considering it matters just as much what people perceive your community to be as what your community actually is?

This was a good post, and it raises good questions. I apologise if my answer/input isn't helpful. It's a good issue to raise, because if it ever reaches the point where calling yourself a longtermist or whatever is career suicide, the philosophy dies, for reasons entirely unrelated to the philosophy itself. It's worth considering.


Thanks for replying!

You're right that there's some difficulty in figuring out which opinions we want to poll EAs on and present data for, and in how much outsiders would trust that data, as opposed to perceptions they may have obtained through other means. I didn't fully think this through when I was posting.

re: PR department

I think it would be useful to have PR, although I'm not entirely sure what policies it should adopt, or how insider-versus-outsider conversations should be separated. Some people have raised issues with the EA Forum too, because it is too open to outsiders.

re: Dr Gebru's opinions

Yeah, if you take a very generous interpretation you can see some of it is grounded in truth, although I personally still think it has more to do with her lens for viewing the world, combined with assigning far too high a prior to how much bias exists in the community. Hence I thought being upfront about how much and what kinds of bias exist might help. I'm unsure too. As for her lens itself, it's not one I fully subscribe to, but it's a lens a lot of people have, and hence IMO it's worth engaging with people from inside their own perspective or lens.

No problem!

Absolutely, I can see what you mean. Personal lens counts for a lot and people can run away with ideas of badness. Things are rarely as bad as people criticise them for being, and EA is no different. Yeah it has a few issues here and there but the media and Twitter can often make these issues look far, far worse than they actually are. I can totally see where you're coming from.

It would be nice to learn what movements exist besides EA.
It may be beneficial for people to find those movements.
Overviews could be done within EA forum articles first.

One (probably surmountable, but non-trivial in my view) problem with this is that once you start trying to draft a statement about exactly what attitude we have to capitalism/economics, you'll start to see underlying diversity beneath "don't want to abolish capitalism". This, I predict, will make it trickier than it seems to come up with anything clear and punchy that everyone can sign onto. In particular, leaving aside for a minute people with actually anti-capitalist views, you'll start to see a split between people with actual neoliberal or libertarian economic views they are confident of, who would give ringing endorsements of capitalism; people who are just skeptical that we know whether "capitalism is good" is true, or regard it as too vague to be worth assessing; and people who simply don't think political activism is as good a use of the marginal dollar as other stuff, because they think it's usually not very neglected or tractable. For example, I'd hesitate to sign onto "we are pro-capitalist", but not because I'm anti, so much as because I have a mixture of the latter two positions.

Incidentally, for what it's worth, I strongly suspect that in developed countries with a traditional "party of business" and "party of labour", a somewhat higher % of EAs vote for the "labour" one. I actually think that is consistent with what you've said about community attitudes to capitalism. But if I'm correct about it, I think saying we're pro-capitalist economic rightists will actually confuse at least some outsiders about where we stand on a measure of political affiliation they really care about. (I'm thinking of people on the centre-left here primarily, rather than more radical socialists.)

I see. I think your answer, exactly as you've said it, would be useful to add to intro pages. Maybe a survey result could be added too.

Also, yeah, I don't think this should be a defining feature of EA or something to rally under, but it is important info that could be presented to someone coming to EA for the first time who is already looking for it.

I think EAs are in favour of systemic change. This old article gives a list, which I guess will be much longer now. https://forum.effectivealtruism.org/posts/5XeCA5gKbMakAskLy/effective-altruists-love-systemic-change 

Thank you for this! This makes sense, and this could be added to intro pages.

I also still think it's useful to list out (in intro pages) systemic changes that other movements support but that we don't (or at least don't have consensus support for inside EA).

I suspect that, unfortunately, both in the initial writing of such things and in the finding of them, we'd get more conflict rather than less. I think it will be hard to get EAs to reach consensus on what our biases are, and I'd guess that adversarial people will use that kind of thing as fodder, unfortunately. Maybe there will be people who appreciate learning it and being able to understand EA's role in the intellectual ecosystem, but I don't foresee that doing a lot to reduce friction.

Having more projects in common would serve this goal better, I'd guess, but that's of course complicated in lots of ways.

I see - this is a valid point.

I was thinking of reporting survey results (of EA opinions on non-EA stances) - do you think it is hard to conduct surveys objectively?

Surveys seem valuable, I'd certainly find the results interesting.

In case you haven't seen this, it might address some of (though likely not all of) what you're looking for: https://forum.effectivealtruism.org/posts/LRmEezoeeqGhkWm2p/is-ea-just-longtermism-now-1

A couple of thoughts so far, written at 3am, so hopefully at least somewhat clear:

  1. The post isn't short :)
  2. Another bias is in favour of technocracy over democracy. "Impact is calculated through careful analysis, and this analysis can be done by anyone, so the recipients do not need to govern it or give inputs to it." I do not mean by this that anyone in EA would stand behind this quote as written (though some might), but rather that we're biased in this direction.
  3. These biases can be viewed through more than one lens: on the one hand, this is what a newcomer should expect to currently find in EA circles; on the other hand, they aren't sacred and do not automatically follow from EA principles. Nor are they rigorously argued for; it is rather that no arguments convincing enough to move people against them have been put forward. In other words, they may change, and we can work to change them if we don't like them, as long as we show changing them will help.
  4. Another kind of "bias" that probably is a core part of EA is pragmatism: projects and ideas have to be assessed on how much they contribute to the end goal (whichever it may be: utility, equity, beauty, etc.) - adopting one part of a theory (or political ideology) does not mean all the rest have to be adopted too.

2 makes sense.

Regarding 3, I completely agree. I think you can present this nuance in the intro pages too. Like here is the current distribution of EA opinions on your favorite movement X, but if you feel you can convince us on X we're open to change.