This is a Draft Amnesty Day draft. That means it’s not polished, it’s probably not up to my standards, the ideas are not fully thought out, and I haven’t checked everything. I was explicitly encouraged to post something unfinished! It’s also written in bullets because I find it easier to structure arguments that way. See this LessWrong post for some thoughts on using bullet points for forum posts.
Commenting and feedback guidelines: I’m going with the default — please be nice. But constructive feedback is appreciated; please let me know what you think is wrong. Feedback on the structure of the argument is also appreciated. If you want to get involved in exploring this further, let me know here.

TLDR

Increasing empathy would benefit many cause areas and is neglected. We know how to increase empathy, and the skills required don’t compete with those needed for other EA career paths. Let’s make this an EA cause area.

Motivation: Anecdotally and according to people with authority, increasing empathy seems to be a good thing to do

  • Anecdotes suggest that empathy can have large impact on altruistic behavior
    • The founder of the Shrimp Welfare Project intentionally developed empathy for shrimp
    • Many vegans can point to a moment where they took the perspective of animals (eg in documentaries) that led them to become vegan
    • My personal experience with seeing friends with lower intelligence struggle more than those with higher intelligence led me to research and advocate against “intelligence discrimination”
    • A homeless person recently shared his story with me; his last job was being the person who “opened” cows on a butchery production line. That increased both my commitment to being vegetarian and my compassion for these workers.
    • Rob Wiblin mentioned a few times in his podcast that he has so much empathy that he can’t watch nature documentaries
  • People with authority seem to agree that increasing empathy could be a cause area
    • Owen Cotton-Barratt (former Director of Research at CEA) suggested that “increasing empathy” could dominate “improve animal welfare”
    • Robert Wright (80k podcast guest) mentioned that “Cognitive empathy or perspective taking, just understanding the other person’s perspective, that’s my own nomination for a cure-all. If I could change one thing in the world to solve as many of the world’s problems as possible, it would be to make people better at understanding what’s going on in the mind of everyone” (around 10:10 in this podcast). He is planning to write a book on this.
    • Paul Bloom (Professor of Psychology, Yale): “If I could turn up the dial on cognitive empathy for all humans I would. Because I think most people are fundamentally well intentioned and greater cognitive empathy would lead them to achieve their goals which are for the most part good goals.” (around 31:20 here)

Definitions: By “Increasing Empathy” I mean increasing the number of perspectives taken per capita

  • Empathy: Capacity to take the perspective of others including their feelings, thoughts, experiences
    • Empathy can be split into cognitive (thoughts) and emotional (feelings) empathy; however, I don’t find this split helpful for this post, as both parts are important in perspective taking
      • Cognitive empathy without emotional empathy might be what you’d call a psychopath (armchair theory, around 25:00 here)
  • Increasing it: Building a habit of curious perspective taking, increasing “perspectives taken per capita”
    • Somewhat similar to building a scout mindset
    • The mindset/habit is only the enabler of the actual outcome: “perspectives taken per capita”
  • I do NOT mean
    • Increasing kindness, sympathy, or compassion: more empathy might lead to more compassion, but these feelings without empathy are difficult to act on productively
    • Expanding moral circle to new areas (eg AIs, insects)
    • Using empathy as our only moral guide - obviously not a good idea; also see criticism by Paul Bloom on this around 21:40 here

Main argument: Increasing empathy has high scale, is neglected, and is tractable, making it a candidate for a cause area

Scale: High because empathy is a key driver of altruistic behavior, would have a high impact on many cause areas, and is robustly good

  • In general, empathy seems to be a key driver of altruistic behavior
  • Many cause areas are neglected because of missing empathy for / perspectives of those affected, relative to others
    • Animals: Less empathy for / perspectives of animals
    • Global Health: Less empathy for / perspectives of people far away than those close by
    • Global warming: Less empathy for / perspectives of future humans
    • Far future: Less empathy for / perspectives of far future humans
  • Many cause areas would benefit most from more empathy
    • More empathy -> more people caring about this -> more funding, votes, and talent
    • We have solutions to these problems; we need people to act on them
      • Animals: Stop eating them and vote for policies, enabled by taking animals’ perspectives
      • Global health: Redistribute money and vote for policies, enabled by taking global south perspective
      • Global warming: Consume less and vote for policies, enabled by taking perspective of “next generation”
      • Far future: Fund research & policy work, enabled by taking perspective of “next next generation”
  • Many cause areas already use increasing empathy as individual interventions
    • Animals: Note the names of Compassion in World Farming or Animal Empathy Philippines, the heavy use of documentaries, and the “sharing stories” of factory-farmed animals
    • Global health: That’s how UNICEF etc. work: telling stories that get people to take perspectives. Even GiveDirectly tells stories of recipients, and I personally find them motivating
    • Global warming: Empathy with future generations (Greta Thunberg, Fridays for Future, etc.) is arguably one of the strongest forces motivating people to act on climate change
    • Far future: Eg 80k podcast on “using cognitive empathy to save the world”
    • Cause prioritization meta: Using radical empathy to find “cause x”
    • Increasing rationality: One could argue that the fanfic Harry Potter and the Methods of Rationality was key to making rationalist thinking accessible to a larger audience
  • Beyond cause areas, side effects would mostly be positive
    • Positive side-effects are probably wide-ranging
      • Reducing inequality in general (Piketty)
      • Support for discriminated groups (eg immigrants, LGBTQ people, people of color, people with low intelligence, homeless people, …)
      • Probably helpful for superforecasting skills (Robert Wright suggested this in a podcast)
    • Negative side-effects are probably small. Some possible negative effects:
      • Donations driven by empathy to ineffective causes
        • Let’s grow the pie instead of redistributing it - total donations are much smaller than they could be; let’s increase them, even at the risk that some donations won’t be maximally effective
      • Becoming too empathetic, with negative effects on mental health
        • Anecdotal evidence from EAs
        • That’s not where we are right now; we need incremental changes, and most people could probably use a bit more empathy
        • It will hurt a bit, but that’s probably a good thing
          • Imagine slavery were still a thing today - would you suggest not empathizing with slaves because that would make us feel sad?
        • Psychology suggests the opposite: More empathy = more life satisfaction
      • EA being framed as not-to-be-taken-seriously hippies spreading love
        • If anything, we’re perceived as too cold right now; an incremental shift to more loving won’t hurt

Neglectedness: High because empathy used to be promoted by spiritual traditions, and nobody has stepped in to replace them

  • Practiced by all kinds of traditions in the past but traditions are dying out
    • Christianity: Love your enemy practices
    • Buddhism: Loving kindness meditations
    • New-age: Spread love
  • Nobody is taking the place of spiritual traditions
    • Modern-day individualism runs contrary to this
    • There is some mindfulness practice, but often with a focus on self-improvement/relaxation, less on perspective taking
    • Filter bubbles / echo chambers / social media probably reduce perspective taking rather than increase it
    • Media can have empathy-increasing effects (documentaries, good fiction, …) but this seems to mostly be a side effect; there is no empathy-directed funding.

Tractability: High because we know empathy has increased historically, the skills can be learned, and even if empathy as a trait turns out to be more rigid, we can still influence the number of perspectives taken per capita

Next steps: EA should focus on this more than we do right now

  • Similar to the longtermist argument, I don’t think all of our efforts should go into increasing empathy; I’m suggesting we put more into it than we do right now
    • I’m not aware of a single EA charity or org focusing on this
  • 80k lists “improving individual reasoning and decisions” but not “improving empathy” as a promising cause area - isn’t EA the combination of brain and heart, of reasoning and empathy?
    • Closest to this is promoting values and moral circle expansion but it’s not the same
      • Promoting values is different because values operate on a societal level, whereas empathy operates on an individual, more actionable level. Eg as a Western society we value each human life equally, but on an individual level we’re not doing as much as we could to care for each person equally. See here or here for why trying to influence or lock in values might be a bad idea.
      • Moral circle expansion is different because, the way it’s used in EA, it seems to aim at increasing who should be considered a moral agent (eg wild animals, AIs, …), whereas “increasing empathy” argues for increasing our empathy for existing moral agents
  • The skills and personal fit required for most interventions don’t compete with the skills and personal fit required for most other cause areas
    • Most cause areas require very quantitative, analytical skills (AI, programming, …)
      • Of all 678 roles on the job board right now, 311 are tagged research, 252 operations, 163 policy, 87 software engineering, 73 outreach, and 67 management
        • One job can have multiple tags
        • Research, ops, policy, eng all seem very analytical
    • This cause area actually offers opportunities to many people with more interpersonal, creative skills (arts, writing, film making, event organization, game development, …)
      • Not saying these aren’t needed in EA (eg outreach, maybe ops, …) but on balance I feel that there is more supply than demand
      • Would also help to increase diversity in EA
  • There is lots of work to do (disclaimer: these are ideas I brainstormed in a few minutes, not necessarily good ideas)
    • Research ideas
      • Research how to measure impact and effectiveness from an EA perspective of different empathy interventions
      • Research which “grow empathy skills” interventions are most promising
      • Research which “take more perspectives” interventions are most effective
    • Startup ideas
      • “Grow your empathy” app
      • Platform that makes sharing perspectives much easier (eg video calls with people who suffer from discrimination/poverty, with butchers, with AI researchers, …; to be used in schools etc)
      • Platform that facilitates 1:1 discussions, e.g. between democrats/republican, straight/LGBTQ, rich/poor
    • Policy ideas
      • Mandatory “social service” as a way to build empathy / broaden perspectives
      • Integrating social housing more as part of housing reform
      • Funding for media/culture projects to be evaluated based on diversity of perspectives shown
    • Charity ideas
      • Something like Dollar Street for other cause areas
      • Applying effectiveness ideas to culture-focused charities
      • Charity evaluator for empathy building
    • Perspective sharing ideas
      • Writing novels on underrepresented topics
      • Filming documentaries on underrepresented topics
      • Creating immersive VR experiences on underrepresented topics
  • Interested in this topic? Reach out here
    • To help refine these arguments
    • To start something
    • To get updates
Comments

I'm so glad I found this post. This is the first I've come across another person that feels strongly about increasing empathy potentially being a powerful lever for suffering prevention. I read the post, and plan to go through all the links you included to do further research.

You inspired me to post an essay I wrote a few months ago with a similar theme.

Your mention of Veil of Ignorance gave me a bit of an idea that I brainstormed with ChatGPT, and that you might find interesting:

User: Is there anything here: I think the veil of ignorance could be a powerful motivator for empathy. (I’m not sure if this is exactly it): if you imagine you have no way of knowing which body you wake up in tomorrow, or next year, you’ll be more inclined to promote policies that will help the worst off, to reduce the risk if you end up in that body.

Now if this could somehow be minified into a meditation, or game, or something like that, it might train the mind to consider those in the worst situations, and be more likely to help them.

It’s possible that empathy is not increased, and it is only selfishness at the thought of risk.

Then again, one can imagine we are not separate beings at all, and are one consciousness being experienced through an infinite number of sentience-moments. Then there is no selfishness, only helping a part of yourself you hadn’t been thinking much about.

But just training your brain during the game to consider how bad some people have it might keep you thinking that way afterwards.

ChatGPT: Your idea touches on a fascinating intersection of philosophy, psychology, and technology. The Veil of Ignorance, proposed by philosopher John Rawls, indeed encourages empathy and a broader sense of fairness by asking individuals to design society without knowing their own place within it. Translating this concept into a meditation, game, or interactive experience could be a compelling way to promote empathy and suffering-focused values. 

Meditation:

A guided meditation could involve users imagining waking up in different life situations, and asking them to consider the kind of world they'd want to create from that vantage point. 

Game:

A role-playing game could be designed where the roles you play are randomly assigned, and choices made during gameplay affect the overall well-being of the game world. Players could then see the impact of their decisions from different social and economic perspectives. 

Interactive Experience:

Virtual Reality or Augmented Reality could offer an even more immersive experience of 'walking a mile in someone else's shoes,' thereby promoting empathy and compassion. 

Ethical Considerations:

  1. Selfishness vs Empathy: While the game might initially tap into self-interest, over time, repeated exposure could cultivate genuine empathy.
  2. Universal Consciousness Perspective: Integrating the idea that we're all interconnected could add a philosophical layer to the experience, deepening the impact.
  3. Ongoing Influence: The goal would be that the mental models formed during the game or meditation could carry over into real-world decision-making and behavior.

Spending a few minutes a day brainstorming and outlining how to create such a meditation or game could be an excellent way to contribute to your long-term goals of fostering empathy and reducing suffering.

User: Maybe i could try writing a book from the perspective of someone who is thrust into this situation. He occasionally has opportunities to make decisions that affect the population, maybe as a politician, and he wakes up every day (month, year?) in a new body, experiencing everything from the extreme joy of love to depraved torture to boredom

ChatGPT: That's an intriguing idea for a novel. Writing a book from such a perspective could offer readers a vicarious journey through diverse human experiences, encouraging them to think more deeply about empathy and the reduction of suffering. The protagonist's transition through different bodies and circumstances could serve as a vivid illustration of the Veil of Ignorance concept, particularly if he or she holds some sort of influence over society, like a politician.

You could spend a few minutes each day outlining chapters, developing characters, or even just jotting down ideas and scenes that you'd like to include. This small daily commitment could gradually build into a substantial work that aligns with your broader goal of increasing suffering-focused values in the population. Even if you only write for a few minutes a day, each session would help solidify your ideas and keep your project moving forward.
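To make the maximin intuition behind the veil concrete, here is a toy Python sketch of the game mechanic, the way I understand it. All function names and welfare numbers are made up for illustration; this is a sketch, not a design:

```python
import random

# Toy simulation of the veil-of-ignorance game mechanic: the player is
# assigned a random "body" under each policy and compares policies by
# average welfare and by the worst-off outcome (Rawls's maximin rule).
# The welfare numbers below are invented purely for illustration.

def evaluate_policy(welfares):
    """Return (average welfare, worst-case welfare) for a population."""
    return sum(welfares) / len(welfares), min(welfares)

def wake_up_in_random_body(welfares, rng=None):
    """Simulate waking up in a randomly assigned body under a policy."""
    return (rng or random).choice(welfares)

status_quo = [100, 80, 60, 10, 2]        # unequal society, higher peaks
redistribution = [70, 65, 55, 40, 30]    # flatter society

_, sq_worst = evaluate_policy(status_quo)
_, rd_worst = evaluate_policy(redistribution)

# A maximin chooser behind the veil picks the policy with the better
# worst case, since they might wake up as the worst-off person.
chosen = "redistribution" if rd_worst > sq_worst else "status_quo"
```

The interesting design question is exactly the one raised above: a purely self-interested player already prefers the flatter policy under maximin, so the game would need repeated random assignments (actually living the worst-off lives, not just computing over them) to train empathy rather than risk aversion.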

Glad you liked it, and great that you posted your essay - it takes some courage here sometimes haha. It’s still something I keep thinking about - my main concern is how tractable it is, though. I feel it’s incredibly hard to significantly change people’s character traits, including empathy. If I were to spend more time on this, I’d probably start there: interview a few psychology professors (like Tania Singer) on whether this is even possible, and if the answer is yes, start to brainstorm interventions. I don’t have much time in the next few months, but if you have a thesis or something coming up, I think it could be a great topic.

That's true, changing people's character is an uphill battle, but it seems that the potential reward is massive enough that all possible interventions should be explored.

Fundamentally, we know that there exist people with varying levels of empathy. (Or at least people who act empathetically to varying degrees.) At the very least, we should figure out what the difference is, through methods like psychology or neuroscience. That would give us a much firmer footing to understand whether empathy can be increased, and how.

For now, I still don't have the mental bandwidth to pursue this in any kind of formal capacity. Still not at 100% mental energy after a long period of depression. I'm currently working on becoming an electrician, but I hope I can find a way to steer my career in this direction in the future.

Finding stability in your life should always be first priority and it sounds like you're on a good path. Wishing you lots of empathy and compassion also for yourself on the way!
