Note: views are my own, not my employer's

I don't think these points are particularly novel - most introductory EA materials about importance, tractability, and neglectedness, e.g. this from 80k, probably include the necessary caveats. If you don't know what importance, tractability, or neglectedness are, that link is a good place to start.

"Importance, Neglectedness, Tractability", or INT, is probably the best three-word summary of how to find high-impact causes. But I find it a little unsatisfying at times, especially the Neglectedness portion.

What EAs are actually looking for is something more like "expected counterfactual marginal net impact". But that's a terrible phrase to introduce to beginners, and INT is a less jargony and more approachable translation/breakdown of how to find true impact.

"Neglectedness" means a few different things

From how I've seen people use it, Neglectedness seems to blend two "fundamental" considerations and one heuristic one. 

The two fundamental considerations: First, what we care about when assessing a cause is tractability at the margin, not just tractability in general. That is, we have to consider the interventions that haven't been done yet or aren't already being done. For example, preventing water-borne diseases in the US is important in the sense that it would be extremely bad if American tap water wasn't treated, and tractable in the sense that water treatment is effective at preventing disease, but it's not very tractable at the margin because tap water is (mostly) already being treated effectively in the US.

The second one is adjusting for counterfactual impact. When one actor decides to address a cause, do other actors put in a little less effort in response (a.k.a. funging)? Would someone else have stepped in eventually?[1] 

The heuristic consideration: Causes that have lower total spending/effort/public attention tend to have higher-tractability interventions available. So independently of the "fundamental" adjustments you have to make to account for pre-existing or counterfactual efforts, causes that are neglected in an absolute sense tend to be good ones. But that's just a heuristic, not something that's true in principle. Popular causes like climate change and developed-country poverty arguably seem less promising for EAs than other causes that receive much less attention. But if you were doing an explicit cost-benefit calculation, this heuristic would melt away once you worked out the details of what you could actually do at the margin.
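To make the "melting away" concrete, here's a toy model (all numbers and the functional form are hypothetical, chosen purely for illustration): if impact from total spending follows a diminishing-returns curve, then the value of your next dollar depends on how much is already being spent, which is exactly why absolute neglectedness works as a proxy for marginal tractability.

```python
import math

def impact(spending, a, b):
    """Total impact (arbitrary units) of `spending` dollars on a cause,
    with diminishing returns: impact = a * log(1 + spending / b)."""
    return a * math.log(1 + spending / b)

def marginal_impact(spending, a, b):
    """Impact of one extra dollar at the current spending level
    (the derivative of `impact` with respect to spending)."""
    return a / (b + spending)

# A "popular" cause: large overall importance (big a), but $10B already spent.
popular = marginal_impact(10e9, a=5e9, b=1e8)

# A "neglected" cause: a tenth the overall importance, but only $10M spent.
neglected = marginal_impact(10e6, a=5e8, b=1e8)

# Despite being 10x less important in total, the neglected cause offers
# roughly 9x more impact per marginal dollar in this toy setup.
print(neglected > popular)  # → True
```

Once you've written down the explicit marginal calculation, "neglectedness" is no longer a separate criterion; it's just one input (current spending) into marginal tractability.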


None of this is to say that the INT framework is wrong or bad. I just want people to know that it's a simplification and to help them keep these things clear in their minds.


  1. ^

    This is maybe just the same as the first one; whatever.


Comments

Borrowing this from some 80k episode of yore, but it seems like another big (but surmountable) problem with neglectedness is deciding which resources count as going towards the problem. Is existential risk from climate change neglected? At first blush, no: hundreds of billions of dollars go toward climate every year. But how much of that is actually going towards the tail risks, how much should we downweight it for ineffectiveness, and so on?