Bastian Jaeger


I'm curious how strongly your scale would correlate with constructs that sound very similar to scout mindset, such as actively open-minded thinking and intellectual humility. There are probably also scales out there that tap into truth-seeking as a motive. I bring this up because one problem we're experiencing in (social) psychology is the staggering number of constructs and scales being introduced even though most of the variance in these individual differences is already captured by existing constructs (for example, "grit" got a lot of attention for predicting various performance outcomes, but it seems like it's mostly repackaged conscientiousness).

Maybe there's something unique to the scout mindset idea - I don't have a deep enough understanding of what it entails and how it overlaps with existing constructs, so it's definitely worth checking out and I'm curious what you'll find. For example, you could test whether your scale predicts variance in some reasoning/judgment outcome that should be determined by scout mindset, even after you control for actively open-minded thinking and intellectual humility (a sketch of what that check could look like is below). There's also this cool tool where you can input the items of your scale and it shows how much it overlaps with existing scales: https://rosenbusch.shinyapps.io/semantic_net/.
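
For what it's worth, here's a minimal sketch of that incremental-validity check as a nested regression comparison. It uses simulated data, and the variable names (scout_mindset, aot, ih, judgment) are hypothetical placeholders, not the actual scales:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data; in a real analysis these would be participants'
# scale scores. All variable names here are hypothetical placeholders.
rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "aot": rng.normal(size=n),            # actively open-minded thinking
    "ih": rng.normal(size=n),             # intellectual humility
    "scout_mindset": rng.normal(size=n),  # the new scale
})
df["judgment"] = (0.4 * df["aot"] + 0.3 * df["ih"]
                  + 0.2 * df["scout_mindset"]
                  + rng.normal(scale=0.8, size=n))

# Baseline model: existing constructs only.
baseline = smf.ols("judgment ~ aot + ih", data=df).fit()

# Full model: add the new scale on top of the existing constructs.
full = smf.ols("judgment ~ aot + ih + scout_mindset", data=df).fit()

# Incremental validity: variance explained beyond the existing constructs.
print(f"Baseline R^2: {baseline.rsquared:.3f}")
print(f"Full R^2:     {full.rsquared:.3f}")
print(f"Delta R^2:    {full.rsquared - baseline.rsquared:.3f}")

# Nested-model F-test for whether the increment is statistically reliable.
f_stat, p_value, df_diff = full.compare_f_test(baseline)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

If the delta R^2 is negligible once the established scales are in the model, that would suggest the new scale is mostly repackaging existing constructs, as with the grit example above.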

You're right that both empathy and compassion are typically used to describe what motivates people to relieve someone's suffering. Neither perfectly captures the preventive thinking or the consideration of interests (beyond welfare and suffering) that characterizes longtermist thinking. I think you're right that compassion doesn't lead you to want future people to exist. But I do think it leads you to want future people to have positive lives. This point is harder to make for empathy. Compassion often means caring for others because we value their welfare, so it can easily be applied to animals or future people. Empathy means caring for others because we (in some way) feel what it's like to be them or to be in their position. That seems more difficult when we talk about animals and future people.

I would argue that empathy, as it is typically described, is even more local and immediate, whereas compassion, again as typically described, gets somewhat closer to the idea of putting weight on others' welfare (in a potentially fully calculated, unemotional way), which I think is closer to EA thinking. This is also in line with how Paul Bloom frames it: empathy is the more emotional route to caring about others, whereas compassion is the more reflective/rational route. So I agree that neither label captures the breadth of EA thinking and motivations, especially not when considering longtermism. I'm not even arguing very strongly for compassion as the label we should go with. My argument is more that empathy seems like a particularly bad choice.

Agreed - I think Paul Bloom's distinction makes a lot of sense. Many prominent empathy researchers have pushed back on this, mostly to argue for the Empathy 3 definition that I listed, but I don't see any benefit in conflating these very different processes under one umbrella term.

Let me make a case that we should call it Radical Compassion instead of Radical Empathy. This is a very minor point of course, but then again, people have endlessly debated whether Effective Altruism is a sub-optimal label and what a better label would be. People clearly care about what important things are called (and maybe rightly so from a linguistic precision and marketing perspective).

You probably know this literature, but there's a lot of confusion around how empathy should be defined. Sometimes, empathy refers to various perspective-taking processes, like feeling what another feels (let's call it Empathy 1). I think this is the most common lay definition. Sometimes, it refers to valuing others' welfare, also referred to as empathic concern or compassion (let's call it Empathy 2). Sometimes, definitions reference both processes (let's call it Empathy 3), which doesn't seem like the most helpful strategy to me.

Holden briefly points to the debate in the post you link to, but it's not clear to me why he chose the empathy term despite this confusion and disagreement. In one place, he seems to endorse Empathy 3, but in another, he separates empathy from moral concern, which is inconsistent with Empathy 3.

I think most EAs want people to care about the welfare of others. It doesn't matter whether people imagine what it feels like to be a chicken being pecked to death in a factory farm (that's going to be near-impossible), or whether they imagine how they themselves would feel in factory farm conditions (again, very difficult to imagine). We just want them to care about the chicken's welfare. We therefore want to promote Empathy 2, not 1 or 3. Given the confusion around the empathy term, it seems better to stick with compassion. Lay definitions of compassion also align with the "just care about their welfare" view.