Commissioning an ethnography or routine anthropological observation of EA communities could be good for our epistemic hygiene. A lot of the big differences of opinion in EA today don't come down to empirical matters, but to priors and values. It's difficult to get anywhere using logic and debate when the real difference between sides is, say, how realistic a catastrophe feels or whether you lean negative utilitarian. One productive way forward, as I see it, is identifying strong motives or forces that lead us to hold certain beliefs besides their truth value.

With longtermism, EA is trending into areas where it's difficult to make short-term testable predictions that could expose motivated reasoning or bias (let alone unforeseen complications). Without many empirical checks available in EA's new hot topics, I don't know how to adjudicate between my biases and everyone else's. That's why I think it would be extremely interesting to see what a sociologist or ethnographer had to say about this topic and everything else we do.

What I'd want is a breakdown of EA's social dynamics and the role beliefs play in them, from someone who won't get bogged down in the content of EA beliefs -- just a descriptive analysis of which beliefs and associated behaviors are doing what in our system. In particular, I'd want to know which beliefs they viewed as serving important social functions, i.e. having the greatest reason to persist without being true.

I have my issues with anthropology as a field, but I generally approve of the practices around ethnography. I think an ethnography of EA would be a valuable outside opinion that could offer unique access to difficult-to-debug areas of our thinking. Though not my top priority, such a document might also reveal promising areas for movement-building or previously unappreciated community vulnerabilities.

I'm just curious what people think about this idea for now. Please let me know if this has already been done (yay!). I haven't heard of an anthropological study of EA yet, and a quick Google search didn't turn up anything relevant. Even if it has been done before, EA moves so fast that it could surely profitably be done again. While googling for this post, I learned that businesses commission ethnographies for the kinds of reasons I'm listing, so we might be able to hire someone like that without having to interest a researcher in us for their own work. An analysis of the social function of our beliefs is potentially such a good epistemic check that perhaps we should hire someone to do it on a regular basis, or try to get someone interested in us for a dissertation.


I think ethnography could be useful. But what I really want is for people to spend more time discussing why they make donations, prioritize certain causes above others, etc.

People write about this on the Forum all the time, but the number of people who post on the Forum is a tiny fraction of the number who donate a lot of money, want to work in a certain field, etc. I don't mind if people have lots of hand-wavey bits in their models (lord knows I do); I mostly want to see what kinds of reasons they think they have:

  • How many of us mostly make decisions by putting a lot of trust in EA organizations?
  • How many of us found that the decisions we were making already matched up with what EA organizations recommended?
  • How many of us do any kind of independent analysis of orgs we support, or even read what those orgs write about themselves?

...and so on. Invisible motives can be very powerful, but they don't have to be invisible. (Now that I've said this, I realize I should write up a "where I'm giving and why" post at the end of this year; thanks for the inspiration, Holly!)

Can you spell out why you'd like to see that? As I read your comment I immediately thought 'I would also like to see this', and then realised I wasn't sure why self-reports of reasons would be useful.

This could be a long essay, but here are the two points which most stand out to me:

1. I'd like a culture of more honesty/transparency in EA around, specifically, charitable giving; it's a huge part of the movement, but few people talk openly about their own giving decisions, which seems like it has a few different bad effects (for example, making it seem like direct work is a much bigger part of EA than it is, thus increasing the pressure on people to do direct work and feel like donating doesn't matter).

2. I want to learn from people who have spent time thinking about giving, even if those thought processes aren't completely clear or unbiased. I can't possibly follow all of the interesting charities that might appeal to EAs, so seeing where people give is often really informative for me.

(I work for CEA, but these views are my own.)

Seems like there are lots of incentive effects & cognitive biases that'd be activated when someone writes up a public-facing account of their prioritization & donation decisions.

Well, the idea would be to try and write your way through those biases and incentives as best you can -- the idea being that EA should have a culture where it's fine to not have all the numbers and to have a personal pull in certain directions, as long as you can recognize this. I'd guess that 90+% of Giving What We Can members don't have really distinct personal models for their donations, for example, and I'd be interested to hear how they choose instead.

the idea would be to try and write your way through those biases and incentives as best you can

I think a crux here is that I'm bearish about the community being able to collectively write its way through this in a way that's positive on net.

It seems like you're more bullish about that.

(I agree that getting more truth-tracking info about why folks are making the decisions they make is a good goal. I think we have a tactical disagreement about how to surface truth-tracking information.)

I think that if a lot of people tried to do this, few would fully succeed, and most would mostly fail, but that we'd all learn a lot in the process and get better at bias-free belief reporting over time.

The EA community has become unusually good at some forms of communication (e.g. our online discussions are more civil and helpful than those almost anywhere else), and I think that's partly a function of our ability to help each other improve through the use of group norms, even if no group member fully adheres to those norms.

I think that if a lot of people tried to do this, few would fully succeed, and most would mostly fail, but that we'd all learn a lot in the process and get better at bias-free belief reporting over time.

Right. I'm modeling some subset of the failures as negative expected value, and it's not obvious to me that the positive impact of the successes would outweigh the impact of these failures.

The EA community has become unusually good at some forms of communication (e.g. our online discussions are more civil and helpful than those almost anywhere else)

Totally agree. I don't understand why our communication norms are so good (compared to benchmarks).

Because I don't have a believable causal model of how this came to be, I have a Hayekian stance towards it – I'm reluctant to go twiddling with things that seem to be working well via processes I don't understand.

I'm reluctant to go twiddling with things that seem to be working well via processes I don't understand.

To me, one of the things that has "worked well" historically has been "people in EA writing about why they've made decisions in great detail". These posts tend to be heavily upvoted and have often been influential in setting the tone of discussion around a particular topic. I don't think people should be forced or pressured to write more of them, but I also don't see why more of them would turn the sign from positive to negative.

Ben Hoffman's latest feels tangentially relevant to our disagreement here.

... but I also don't see why more of them would turn the sign from positive to negative.

There's probably strong selection effects here.

People write up things / spotlight things that are straightforward to justify and/or make them look good.

People avoid things / downplay things that are opaque and/or unflattering.

(speculative) Perhaps more posts like this would increase the selection pressure, leading to a more distorted map of what's going on / more distance between the map and the territory.

Zvi's recent post feels tangentially relevant to our disagreement here:

This is a world where all one cares about is how one is evaluated, and lying and deceiving others is free as long as you’re not caught. You’ll get exactly what you incentivize.

To the extent that ethnography is anonymized, I could imagine people speaking more freely than they do in blog posts, interviews where they're identified, etc.

I see this as something of a different question, i.e. "What portion of this disagreement is due to factors we can access through self-reflection and rationally discuss?" I would want the ethnography to get at things we're too embedded in to see.

To what extent should we fund a counter organisation to, say, 80,000 Hours to re-research its decisions - an independent watchdog, so to speak?

The term "counter organization" sounds like a bad place to start. I think we currently live in a world where EA organizations are generally pretty transparent about their reasoning and open to being challenged in public, so I'm not sure what a specific "independent watchdog" might accomplish, but I'd be curious to see more details of a proposal in a Forum post!

(I work for CEA, but these views are my own.)

Good point. I originally interpreted the comment to mean just an independent take on 80k topics, and I'm super-supportive of that, but I agree with you that it shouldn't be adversarial.

Maybe post this as a separate question?

I remember a couple of people doing something slightly similar to this.

Dan Artus wrote a dissertation in 2018 - "An ethnographic exploration of ethics, empathy and data practises within the London Effective Altruist community"

Nick Philips wrote a thesis about the EA movement in 2015 - "Rational Faith: A Study of the Effective Altruism Movement"

Link for Artus' dissertation?

Cool! I've never heard of these so thank you very much.

I've been really intrigued by the number of times I've found myself and/or others surprised to see the EA community doing something that nearly all other communities do (e.g., infighting, unfairly excluding an outgroup, unfairly preferring something or someone high-status). I think better awareness of this could be valuable, and we may be able to learn a good deal more from the successes and failures of other communities.

I feel like growing up religious (and especially having lots of different Protestant sects in the family) gives me insight that a lot of people in EA who were raised secular don't have. I think it's because we think of those failure modes as having to do with irrational religion (like believing in the supernatural) and not the rational approach we're taking. Short of getting a specific study of EA, I think most EAs would benefit from learning about the history of social and especially religious movements to see how much we are like them.

Giles Fraser summed up EA London's atmosphere as 'an evangelical youth group' - not in a mean way - and I've frequently worried that we'll undergo something akin to a church split. The parallels are quite obvious if you're familiar.

Here is an edited version of the dissertation mentioned earlier. It has had most of the non EA London related content removed to help make it more relevant.

An ethnographic exploration of ethics, empathy and data practises within the London Effective Altruist community

Haven't had a chance to read much, but it's already gold.

How does one tag someone with lots of money in this post?

I phrase this in jest, but mean it in all seriousness - the rhetoric at the moment is 'be more ambitious' because we are less cash-constrained than before, but maybe we should add to this: 'be more ambitious, but twice as self-critical as before'.

If I give you money, can you make this happen? 

Oh man, I would love to try, even if all I do is locate and pay someone else who can find an ethnographer.

Yeah I'd know how to go about making this happen, including figuring out what's a decent research question for it, but not undertaking it myself.

I don't think the research question is the hard part, compared to finding the people to do it. If you're interested in this I'd be happy to see a proposal on it, including who you think is good to research stuff here! :)

Interesting, I think it's the other way round; there are tonnes of companies and academic groups who do action-oriented evaluation work which can include (and I reckon in some cases exclusively be) ethnography. But in my experience the hard part is always "what can feasibly be researched?" and "who will listen and learn from the findings?" In the case of the EA community this would translate to something like the following, ranked from hardest to easiest:

  • what exactly is the EA community? or what is a representative cross-section / group for exploration?
  • who actually wants to be surveilled and critiqued; to have their assumptions and blindspots surfaced in a way that may cast aspersions on their actions and what they advocate for? especially if these are 'central nodes' or public(ish) figures
  • how can the person(s) doing ethnography be given sufficient power and access to do their work effectively?
  • what kind of psychological contracts need to be engendered so that the results of this research don't fall on deaf ears? and how do we go about that?
  • what things do we want to learn from this? should it be theory-driven, or related to specific EA subject-matter (e.g. long-termism)? or should the ethnographer be given a wider remit to do this work?

I'd be happy to have a conversation about what this could look like - maybe more useful than a paper, because I suspect there are an unhelpful number of potential misunderstanding potholes in this area, so it's easier to clarify by chatting things through.

I think that EAs are, at least ostensibly, very open to being studied and critiqued. They could be an excellent population for academic ethnographers, or simply a very compliant client community for action-oriented evaluation.

Here's a podcast I listened to years ago which has influenced how I think about groups and what to be sceptical about; most specifically what we choose not to talk about.

This is why I'm somewhat sceptical about how EA groups would respond to an offer of an ethnography; what do people find uncomfortable to talk about with a stranger observing them, let alone with each other?

How, if at all, do you envision this differing from some of the portrayals of EAs in Strangers Drowning?

I was imagining it as more of a population study than case studies or biographies. More of a study of EA the movement than the stories of individuals involved in EA.
