I don't intend to convince you to leave EA, and I don't expect you to convince me to stay. But typical insider "steel-manned" arguments against EA lack imagination about other people's perspectives: for example, they assume that the audience is utilitarian. Outsider anti-EA arguments are often mean-spirited or misrepresent EA (though I think EAs still under-value these perspectives). So I provide a unique perspective: a former "insider" who had a change of heart about the principles of EA.
Like many EAs, I'm a moral anti-realist. This is why I find it frustrating that EAs act as if utilitarianism is self-evident and would be the natural conclusion of any rational person. (I used to be guilty of this.) My view is that morality is largely the product of the whims of history, culture, and psychology. Any attempt to systematize such complex belief systems will necessarily lead to unwanted conclusions. Given anti-realism, I don't know what compels me to "bite bullets" and accept these conclusions. Moral particularism is closest to my current beliefs.
Some specific issues with EA ethics:
- Absurd expected value calculations/Pascal's mugging (see the toy calculation after this list)
- Hypothetically causing harm to individuals for the good of the group. Some utilitarians come up with ways around this (e.g., the reputational cost of harming someone would outweigh the benefits). But that defense is contingent: it raises the possibility that in some cases the costs won't outweigh the benefits, and then we'll be compelled to harm individuals.
- Under-valuing violence. Many EAs glibly act as if a death from civil war or genocide is no different from a death from malaria. Yet this contradicts deeply held intuitions about the costs of violence. For example, many people would agree that a parent breaking a child's arm through abuse is far worse than a child breaking her arm by falling out of a tree. You could frame this as a moral claim (violence holds a special horror) or as an empirical claim (violence causes psychological trauma and other harms that must be accountedted for in a utilitarian framework). The unique costs of violence are also apparent in the extreme actions people take to avoid it: mass migrations are most strongly associated with war, less so with economic downturns, and far less with disease outbreaks. This revealed prioritization doesn't line up with how bad EAs judge these problems to be.
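To make the first bullet concrete, here is a toy sketch of the kind of calculation I mean. All the numbers are invented for illustration (the $5 ask, the 10^-30 credence, and the $5,000-per-life conversion are made up), but the structure is the standard Pascal's mugging setup: under naive expected value reasoning, an astronomically large claimed payoff swamps an astronomically small probability.

```python
# Toy Pascal's mugging: naive expected value with invented numbers.
# A "mugger" claims that paying $5 will save an astronomical number of lives.

p_mugger_truthful = 1e-30   # tiny credence that the claim is true (made up)
lives_claimed = 1e40        # absurdly large claimed payoff (made up)
cost_in_lives = 5 / 5000    # $5 at a hypothetical $5,000 per life saved elsewhere

ev_pay = p_mugger_truthful * lives_claimed - cost_in_lives
ev_refuse = 0.0

print(f"EV of paying:   {ev_pay:.3e} lives")  # ~1e10 lives: paying "wins"
print(f"EV of refusing: {ev_refuse} lives")

# Naive EV says hand over the $5 no matter how implausible the claim,
# because the claimed payoff can always be inflated faster than the
# credence shrinks. That's the bullet I'm no longer willing to bite.
```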
Once I rejected utilitarianism, much of the rest of EA fell apart for me:
- Valuing existential risk and high-risk, high-reward careers relies on expected value calculations
- For me, prioritizing animals (particularly invertebrates) relied on total-view utilitarianism. I value animals (particularly non-mammals) very little compared to humans and find the evidence for animal charities very weak, so the only convincing argument for prioritizing farmed animals was their sheer numbers. (I still endorse veganism; I just don't donate to animal charities.)
- GiveWell's recommendations are overly focused on disease-associated mortality and short-term economic indicators, from my perspective. They fail to address violence and exploitation, which are major causes of poverty in the developing world. (Incidentally, I also think that they undervalue how much reproductive freedom benefits women.)
The remaining principles of EA, such as donating significant amounts of one's money and ensuring that a charity is effective in achieving its goals, weren't unique enough to convince me to stay in the community.
Yeah, as a two-level consequentialist moral anti-realist, I'm actually pretty tired of EA's insistence on "how many lives we can save" instead of emphasizing how much "life fulfillment and happiness" you can spread. I always thought this was not only a PR mistake but also a utilitarian mistake. We're trying to prevent suffering, so preventing a case where a single person goes through more suffering on the road to death is, utils-wise, obviously more morally relevant than preventing a death with less suffering.
Nonetheless, this is the first I've heard that violence and exploitation are undervalued by EAs. It has always seemed to me that EAs genuinely weep and feel angst in their gut when they read about the violence and exploitation of their fellow man. But what can we do? Regions of violence are notoriously difficult places to set up tractable interventions. So it has always seemed to me that we should focus on what we know works, since lifting people out of disease and poverty empowers them to address violence and exploitation themselves. And giving someone their agency back in this way is, in my view, worth a lot of moral weight because of its long-term (albeit hard-to-measure) consequences.
And now I'm going to say something that I feel some people probably won't like.
I consistently feel that a lot of the critique of EA has to do with how others perceive EAs rather than what they are really like, i.e. prejudice. I mentioned above that I generally feel EAs are genuinely moved to tears (or whatever counts as a significant feeling for them) by issues of violence. But I find that as soon as such a person spends most of their time in public talking about math and weird utilitarian expected value calculations, they are suddenly viewed as no longer having a heart, or not "the right heart." The amount of compassion and empathy a person has is not tied to what weird mathematical arguments they push out but to what they do and feel inside (this is how I operationalize "compassion," at any rate: an internal state leading to external consequences. Yes, I know, that's a pretty virtue-ethics way to look at it; so sue me).
Anyway, maybe part of this is because I know what it feels like to be the high-school nerd who secretly cries when he sees someone getting bullied at break time, but who then talks to people about and develops extensively researched weird ideas like transhumanism as a means of optimizing human flourishing (instead of, say, going to the anti-bullying event everyone thinks I should attend if I really cared about bullying). It makes sense to me that many people think I have my priorities wrong. But it certainly isn't due to a lack of compassion and concern for my fellow man. It's not too hard to go from this analogy to how EAs as a whole are perceived.
This is perhaps what I absolutely love about the EA community. I've finally found a community of nerds where I can be myself and go in depth with uber-weird (any and all) ideas without being looked at as any less compassionate <3.
When people talk about ending violence and exploitation by doing something that will change the "system" that keeps these problems in place, I get upset. This "system" is often invisible and amorphous, a product of ideology rather than, say, cost-effectiveness calculations. What upsets me is that this often means people are willing to give up restoring someone's agency, which you can clearly do by donating to proven disease and poverty alleviation interventions, in order to support a cause against violence and exploitation because it aligns with their ideology. That essentially makes the donation about yourself, about feeling content in your own ethical worldview because doing nothing about that violence and exploitation would make you feel bad, rather than about the individuals on the receiving end of the donation.
Yeah, I know, my past virtue-ethics predilections are showing again. Even if someone like the person I've described above supports an anti-violence cause that, though difficult to measure for effectiveness, is nonetheless doing a lot of good we can't measure, I still don't like it: I'm caring about what people think and arguing that certain self-serving thoughts are morally problematic independent of the end result they cause. So let me show that I'm also strongly opposed to such forms of anti-realist virtue ethics. It's not enough to merely be aligned with the right way of thinking or ideology and trust that good things will come from that. The end result, the actual people on the receiving end, is what actually matters. And this is why I find a "mostly" utilitarian perspective so much more humanizing than the alternatives embraced by the many people who get uncomfortable with its extreme conclusions and reject the whole thing. A more utilitarian perspective forces you to make it about the receiver.
Whatever the case, writing this has made me sad. I'm sad to see you go; you seem highly intelligent and a likely asset to the movement, and as someone on the front line of EA and PR I take this as a personal failure, but I wish you the best. Does anyone know of any EA-vetted charities working on violence and exploitation prevention? Even ones that are a stretch tractability-wise would be good. I'd like to donate; it always makes me feel better.
What do you mean by 'we'? Negative utilitarians?