
This post is a personal reflection on certain attitudes I have encountered in the EA community that I believe can be misleading. It is based primarily on intuition, not on thorough research or surveys.

It is not news that the EA community has an unbalanced demographic, with men in the majority.

I have heard from several women what they dislike about the EA community, and this post is what I have taken from those conversations. I think that if we can move more in the direction I'm describing, the EA community can become warmer and more welcoming to all genders and races (and also more effective at doing good).

I'd like to note that I don't think what I'm about to describe is a widespread problem, but a phenomenon that may occur in some places. Most of my experiences with the EA community have been very positive. I meet mostly caring people with whom I can have interesting, sometimes controversial discussions. And I often meet people who are very willing to help.

Now to the subject:

Some women I have spoken to have described a "lack of empathy" in the group, or, more specifically, that EA people came across as "tech bros" who lacked humility and wouldn't help a stranger because it wouldn't be the most effective thing to do. In an introductory discussion group we ran (in our university group), one of the participants perceived some of EA's ideas as "cold-hearted" and was very critical of the abstract, sometimes detached way of trying to calculate how to do good most effectively.

I believe that these impressions and experiences point to risks associated with certain EA-related ideas.

The idea of optimization

Firstly, the idea of optimising/maximising one's impact is fraught with risks, which have already been described here, here and here (and maybe elsewhere, too).

Judging some actions or causes as more or less worthy of our attention can certainly seem cold-hearted. While this approach is valuable for triage and for prioritising in difficult situations, it also has a dark side when it justifies not caring about what we might normally care about. We should not discredit what might be judged as lesser goods just because some metric suggests it. Nor should it lead us to lose our humility (impacts are uncertain and we are not omniscient) or our sense of caring.

What kind of community are we if people don't feel comfortable talking about their private lives because they don't optimise everything, or don't spend their free time researching or trying to make a difference? Or if people fear that spending time volunteering for less effective non-profits might not be valued, or might even be dismissed? What is the point of an ineffective soup kitchen; after all, isn't it a waste of time in terms of improving QALYs?

I have no doubt that even the thought of encountering such insensitive comments makes you feel uncomfortable.

The following quote might appear to conflict with the goal of EA, but I think it doesn't, and it makes an important point.

“There is no hierarchy of compassionate action. Based on our interests, skills and what truly moves us, we each find our own way, helping to alleviate suffering in whatever way we can.” (Joseph Goldstein, A Heart Full of Peace, 2007)

What we are trying to do is called Effective Altruism, not Altruistic Effectiveness, and we should be trying to be altruistic in the first place, that is, good and caring people.[1]

The idea of focusing on consequences

I also think that an exaggerated focus on consequences can be misleading in a social context, as well as detrimental in terms of personal well-being. Even if one supports consequentialism, focusing on consequences may not be the best strategy for achieving them.

One reason is that, as Stoic philosophy tells us, we can't control the outcomes of our actions.

Another is that if we cling to them, they can distract us from what it means to live an ethical life. When we focus on consequences rather than valuing effort and intention, what it means to be a good person is subject to considerable moral luck.

I think this focus on consequences, which is widespread in EA, can also lead to unhealthy social dynamics. One might be tempted to value effectiveness over kindness, intelligence over caring. I have heard some people say something like "well, I am not that impactful", with a painful touch of imposter syndrome, comparing themselves to other people who have done more impressive things within the EA movement.

Even when we try to see each other as a team working towards a common goal of making the world a better place, we often can't help but form opinions of other people by judging them based on what we value. We are animals who play status games, even when we may not like the idea. So our shared values, what we care about, are important to our community. And I think that these ideas, effectiveness and a focus on consequences, if overly endorsed, can make a community less welcoming and less warm-hearted. This starts with whether we frame and explain EA in the abstract as "trying to do good as effectively as possible" or as "trying to help people and animals as well as we can", and ends with the values we consider most important.

Conclusions

In conclusion, I think that rather than being overly focused on finding the most effective means of doing good, we should also be concerned with becoming more altruistic, caring and compassionate. We should not neglect to care about things that are difficult or impossible to measure by focusing on reason and rationality. We must remain humble, grounded and warm-hearted. Even when dealing with important technical issues such as AI safety or other areas of long-term concern (and especially in those cases), it is important to nurture human qualities and not allow them to recede into the background. I'm not trying to devalue these causes, but they might become an easy way to rationalise away and disconnect from the suffering that is happening right now. Because it can be painful to open up to and connect with suffering, it can be tempting to find a good reason not to.

EA isn't about turning off the heart because it might lead to bias, it's about turning on the head. And when they are in conflict, we need to stop and think carefully about what price we might pay in trying to be as effective as possible. Good and caring actions that may seem ineffective can be very important for reasons that we cannot put into a metric.

We can and should learn not only how to think more rationally, but also how to become more caring, for compassion is a skill we can train. This quality of caring, the intention to help, is the soil on which EA can grow. It is linked to our personal and interpersonal well-being and we should emphasise the importance of nurturing it rather than depleting it by becoming too attached to certain ideas.[2][3][4][5]

A few resources on practicing compassion 

  • Practicing mindfulness (A wandering mind is a less caring mind[6])
  • Practicing to care (e.g. through loving-kindness meditation)[7][8]
  • There are other resources available, such as those from CCARE at Stanford

References and acknowledgements

I would like to thank Ysa Bourgine for her reflections and ideas that contributed to this post, and everyone else with whom I had good conversations about these topics.

  1. I took this expression from Emil Wasteson.

  2. Jazaieri H, Jinpa GT, McGonigal K, et al. Enhancing Compassion: A Randomized Controlled Trial of a Compassion Cultivation Training Program. J Happiness Stud. 2013;14(4):1113-1126. doi:10.1007/s10902-012-9373-z

  3. Jazaieri H, McGonigal K, Lee IA, et al. Altering the Trajectory of Affect and Affect Regulation: the Impact of Compassion Training. Mindfulness (N Y). 2018;9(1):283-293. doi:10.1007/s12671-017-0773-3

  4. Quaglia JT, Soisson A, Simmer-Brown J. Compassion for self versus other: A critical review of compassion training research. J Posit Psychol. 2021;16(5):675-690. doi:10.1080/17439760.2020.1805502

  5. Klimecki OM, Leiberg S, Ricard M, Singer T. Differential pattern of functional brain plasticity after compassion and empathy training. Soc Cogn Affect Neurosci. 2014;9(6):873-879. doi:10.1093/scan/nst060

  6. Jazaieri H, Lee IA, McGonigal K, et al. A wandering mind is a less caring mind: Daily experience sampling during compassion meditation training. J Posit Psychol. 2016;11(1):37-50. doi:10.1080/17439760.2015.1025418

  7. Hutcherson CA, Seppala EM, Gross JJ. Loving-Kindness Meditation Increases Social Connectedness. Emotion. 2008;8(5):720-724. doi:10.1037/a0013237

  8. Fredrickson BL, Boulton AJ, Firestine AM, et al. Positive Emotion Correlates of Meditation Practice: a Comparison of Mindfulness Meditation and Loving-Kindness Meditation. Mindfulness (N Y). 2017;8(6):1623-1633. doi:10.1007/s12671-017-0735-9

Comments

Just wondering if you can acknowledge that EA is not for everyone? I guess I feel a lot "safer" about these types of critiques of the culture and overall focus when people acknowledge that. There are ways I would tweak EA culture in some places to lead to a bigger and broader community. There are also ways I would not, and there are people who I think would never be happy with EA's values unless it already described what they are already interested in and believe, which is very far from, and conflicts with, EA. And those people will never like EA unless we forsake the effectiveness focus altogether.

For example, this person:

In an introductory discussion group we ran (in our university group), one of the participants perceived some of EA's ideas as "cold-hearted" and was very critical of the abstract, sometimes detached way of trying to calculate how to do good most effectively.

(Not saying you didn't do this but) If it were me leading that group I might poke at that a little and help them think differently. Because I think often people like that have picked up a slightly mistaken impression along the way. And based on where that misunderstanding was, it would be great to be able to replace or augment parts of the curriculum, even if it is just with your own disclaimer. You could ask them what would prove them wrong about that conception. You could ask them whether they think there is ever a way to compare interventions, and how they would do so.

But I also might just say something like, "okay well this group and movement might not be for you then and that's okay. You would meet plenty of warmhearted people in EA, and I think by definition it might be wrong to call altruistic tactics coldhearted. But if you have such a strong reaction, you probably won't ever be happy with the goals and tactics of the movement. And that's fine. Good luck in your altruistic endeavors"

I wonder if you think that person would ever really be a good fit for EA?

It is frustrating to see people bounce off the movement saying it is too cold. And it might be a reason to tweak the intro curriculum or work on the culture in your group. There is something going on there, and it does happen often, and I don't like it either. But I don't think it warrants coming to EAs and saying anything approaching "you guys actually are cold and need to work on compassion". I actually don't think they* are? EAs are the warmest most compassionate people I know. (*I had said "we" originally but removing myself from this as I'm one of the colder ones in writing style. But even I have cried many a tear for animals, minorities, people in poverty, etc)

[Edit: I accidentally flowed into a bunch of suggestions, but originally I wanted it to have a questioning tone and just wanted to know what you think]

On that person's objection about coldness, could one not frame it as not being about being cold, but about expanding warmth and care to others? To me, thinking about broiler chickens and neglected families in malaria-ridden areas is not cold at all. What is cold to me is to not think of them when making a choice about career or donations. If anything, my reading of numerical and scientific analyses of broiler chickens has increased my feeling of warmth towards those animals, without in the slightest reducing the warmth I feel toward the homeless person out in the snow. I am quite certain many other EAs feel the same way. I just do not see a conflict here?

And I also think we should be open to and positive towards people with less "warmth" but a desire to help in a more dispassionate manner. As long as one wants to help as much as possible, I do not see why it matters that much how much of the "fuzzies" they get from helping?

I totally agree with what you're saying, that there doesn't have to be a conflict. The way you describe it, I think that extending care through rational reflection is exactly how things can work out very beautifully.

And I also agree with your second point here, that caring doesn't mean getting "warm fuzzy feelings". And since being welcoming involves being non-judgmental about people, of course we should be open and positive to all!

I think that this is a really good point. I shudder at the idea that "EA is not for everyone" because I want to make spaces inclusive and welcoming, and I hate the feeling of being excluded from things for (what I perceive as) no good reason... but I think that recognizing that EA maybe really isn't for everyone has a lot of truth to it.

In a simple sense, some people just don't like taking the warm, fuzzy, feel-good empathy out of decisions. But also, some people don't have the money or the skills to contribute.

Thank you Ivy!
I acknowledge that EA may not be for everyone. And I don't want to make EA popular at any cost.
What matters to me is the reason why it might not be for someone: whether they are just cringing at some unsympathetic social behaviour, or whether they genuinely disagree with the ideas but still feel welcome.
I think it is important to maintain the effectiveness mindset while being careful not to become somewhat sociopathic or come across as a robot, but to remain friendly and approachable as a human being.

Regarding the introductory fellowship, we had a very engaged discussion and the reason for their impression was to a large extent a misunderstanding of the idea. And it is this possibility of misunderstanding the idea that I wanted to highlight.
If, after such a discussion and clarification, they still don't really feel that this would be something for them, I have no problem with wishing them well and good luck and letting them go. But I would also say "the door is open, you are welcome to talk to us again if you like".

I also didn't want to say "you're really cold and need to work on your compassion", I think that would be quite a weird thing to do, honestly. As I tried to mention in the beginning, I feel very lucky to know so many wonderful people in the community. I was just trying to point out risks that I see and the value of these virtues of kindness. So that we don't lose them along the way, but continue to cultivate them.

Timon - the whole point of EA was to get away from the kind of vacuous, feel-good empathy-signaling that animated most charitable giving before EA.

EA focuses on causes that have large scope, but that are tractable and neglected. These three criteria are the exact opposite of what one would focus on if one simply wanted to signal being 'warm' and 'empathic' -- which works best when focusing on specific identifiable lives (small scope), facing problems that are commonly talked about (not neglected), and that are intractable (so the charity can keep running, without the problem actually getting solved).

In my view, it's entirely a good thing that EA has this focus. And it's inevitable that some people who can't understand scope-sensitivity would feel like it's 'heartless' and overly quantitative.

It's helpful to bear in mind psychologist Paul Bloom's distinction between 'empathy' and 'rational compassion'. EA, as I understand it, tries to do the latter.

I agree with all this, and I also think the OP might be speaking to some experiences in EA you might not have had which could result in you talking past each other.

Thanks Geoffrey for raising this point. I agree that emotional empathy as defined by Paul Bloom can lead to bias and poor moral judgement, and I also appreciate the usefulness of the rational EA ideas you describe. I don't want to throw them out the window, and I agree with Sam Harris when he says "Reason is nothing less than the guardian of love".
I agree that it is important to focus on effectiveness when judging where to give your money. I was trying to make a very different point.

I was trying to make the point that we should not dismiss the caring part that might still be involved in well-intentioned but poorly executed interventions. And I have tried to make the case for being kind and not dismissing human qualities that do not appear to be efficient. I have tried to show how following these ideas too much, or in the wrong way, can lead to negative social consequences, and that it is important to keep a balance.

In the context of the less effective charities you describe, the problem I see is not warmth or caring, but bias and naivety. To care is to understand. To understand the cause of suffering and the best way to alleviate it. 
I would also like to point out that while Paul Bloom makes a clear case for the problems with emotional empathy and moral judgement, at the end of the book he emphasises its value in social contexts. Also, I was not trying to argue for this kind of empathy, but basically talking about emotional maturity, compassion and kindness. I think you can make kindness impartial, so that it is consistent with moral values, but also so that other people feel that they are dealing with a human being, not a robot.

I'm not advocating going back to being naive and prejudiced, but rather being careful not to exclude human traits like empathy in everyday social interactions just because they might lead to bias when thinking about charity. Wisdom requires emotional as well as rational maturity.

In conclusion, I think that rather than being overly focused on finding the most effective means of doing good, we should also be concerned with becoming more altruistic, caring and compassionate.

 

I strongly agree with the last half of this sentence. A rocket engine is only valuable insofar as it is pointed in the right direction. Similarly to how it makes sense to practice using spreadsheets to systematize one's decision-making, I think it makes sense to think about ways to become more compassionate and kind.

I disagree for another reason too: I think we should be a movement that is welcoming and feels like home to both the more dispassionate and the more caring among us. I think we might even become stronger by having such a range of emotional drives behind our ambition to do the most good.

I have mixed feelings about this, but broadly speaking I would love to see more interpersonal warmth, agreeableness, friendliness, and general empathy in EA.

I see it as vaguely aligned with virtue ethics (in the vague sense of "be a good person"), and I think that the standard EA mix (heavily consequentialism with a little bit of deontology) would benefit from just a little bit more virtue ethics.

What kind of community are we if people don't feel comfortable talking about their private lives because they don't optimise everything, don't spend their free time researching or trying to make a difference? When people think that spending time volunteering for less effective non-profits might not be valued or even dismissed? What is the point of an ineffective soup kitchen, after all it is a waste of time in terms of improving QALYs?

 

I can't tell if you're saying "these should be universally treated as highly effective" or "people are entitled to do ineffective things". I strongly agree with the latter, but disagree with the former even for core EA interventions, so I certainly don't believe it applies to soup kitchen volunteers.

There's also the question of whether volunteering is harmful (my impression is that Christmas-only soup kitchen volunteers are net costs, and that soup kitchens tolerate food drives in the hope they will lead to money later), but that's a separate issue.

The point I was trying to make is that it is important to value care, and that we should be cautious not to judge, discredit, or look down on things that may not appear to be maximally effective or optimised. I would not evaluate a soup kitchen intervention as highly effective, but I also think that doesn't matter in this context. Not everything has to be optimized.

Good essay. I agree that warm-heartedness is an important value or quality that people in the EA movement would be better off promoting. Cold-heartedness is the dark side of EA. 
