Update (3 May 2022): I may have underestimated the difficulty of identifying which opinions to present, of objectively collecting and aggregating data on EA opinions, and of presenting that data in a way that does not lose nuance. I might also be overestimating the benefit of presenting such data. I might write another post on this after spending more time on it.
Disclaimer: I have not spent much time on this post. It may not be fully coherent or provide the best model for thinking about this topic. This is a short post.
Mostly I just wished to initiate discussion on this topic.
Motivation for the post
The accusations run primarily along leftist lines of thought. EA and longtermism are claimed:
- to prioritise the interests of the rich and powerful over those of common people
- to be divorced from the lived experience of common people
- to follow colonialist lines of thought
- to instrumentalise common people in the pursuit of its ends
- to launder and rationalise biases that the rich and powerful already hold, and actions they already wish to take
- to be cultish or fanatical
Why are accusations of implicit bias problematic?
IMO accusations of implicit bias are very easy to make and very hard to argue against. I personally think Dr Gebru overestimates the amount of bias towards certain lines of thought, but this is hard to prove objectively. This difficulty in disproving bias is a significant part of politics in general. One side can claim that 20% of group A is into objectionable thing XYZ, the other side can claim it's actually more like 5%, and the only way to get at the truth is to read through every single piece of content published by every person in group A, then aggregate and analyse it for hidden motivations, things left unsaid, and so on. In practice almost no one does this, and both sides keep shouting their claims louder and louder instead.
When I say "implicit bias" here, I mean it in a very general sense: people in a group A being sympathetic to a stance XYZ that is not explicitly stated but that most people in the group believe anyway. By this definition, implicit bias is not intrinsically bad; whether it is bad in any given circumstance is a separate question. There are other ways one can define bias, which I am not considering here.
What implicit biases does EA actually have?
Here are some I've observed:
- A majority of EAs are sympathetic to the capitalist mode of production and to economically right-wing lines of thought. Some are far more economically left, but they are in the minority and usually don't post much or represent the movement. The EA movement as a whole does not have a consensus that would enable it to push against the capitalist mode of production.
- Many EAs are not in favour of many of the forms of systemic change espoused by people in other circles. EAs favour systemic change in some narrow areas, and might in future expand the range of changes they wish to bring about. But as of today, the EA movement's consensus is not in favour of major systemic change.
- More often than not, EA seeks to ally with the existing rich and powerful in society rather than attempt to replace them. Many people (non-EAs) have a gut reaction against this.
- (insert more here?)
The above list of biases is what will be most apparent to someone from a leftist community or movement encountering EA for the first time. Entirely different lists of biases might be apparent to other communities, be they first-world Christians, people in China or Russia, and so on.
Why should these biases be made explicit?
By explicit, I mostly mean that they should be made more prominent: included on the front pages of EA websites, in the introductory sections of EA blogs and books, and so on, as opposed to being buried somewhere in the forum.
The main reason to make these biases explicit is that a significant number of people who encounter EA for the first time are already involved in other social movements that aim to do good for people. They will tend to view EA through the lenses of the movements or communities they are already part of, whether you ask them to or not.
I feel that EA today often tries to sidestep political questions, remain diplomatic, and thereby hope for a multipolar consensus in the narrow areas where it wishes to act, such as x-risk reduction and global health and poverty. This might (?) be an insufficient approach, especially if other communities wish to escalate conflict anyway. Working towards mutually beneficial ends only works if both sides are willing not to escalate conflict.
By explicitly analysing EA through the lens of other social movements and popular stances, we reduce the burden on people entering from those movements of understanding EA, both its explicit statements and its implicit biases. IMO people often hunt for implicit bias when encountering new communities, whether you want them to or not.
Reducing this burden also makes it harder for people to misrepresent EA stances accidentally, or to take seriously deliberate (malicious) misinterpretations of EA published by others.
Reducing this burden is important because people usually allocate very little time to analysing claims seriously, hunting for implicit biases, and so on, especially when they casually encounter something novel like EA for the first time. It is a tall order to expect everyone to move past this and spend significant time engaging with EA's object-level claims before forming an opinion.
Do share your thoughts; I wished to initiate a discussion. Please let me know if I can constrain the discussion or make it more meaningful in any way.