There has been a lot of discussion and disagreement over whether EA has a talent gap or a money gap. Some people say the funding gap is no longer very large and that people should contribute their talent directly instead. Others maintain that a substantial funding gap definitely remains.
I think both parties are right, and the misunderstanding arises because we have been talking about the EA movement as a whole instead of breaking it down by cause area. In this blog post I make that breakdown and show why we have been like the blind men touching different parts of the elephant: put the pieces together, and we can make much better decisions.
Poverty
- Talent gap - Small (~10 people)
- Money gap - Large (~$86 million = ~1720 people doing E2G)
Animal rights
- Talent gap - Large (~100+ people)
- Money gap - Mixed (depends on agreement with Lewis Bollard)
Artificial intelligence
- Talent gap - Middle (~50 people)
- Money gap - Small (most projects are very well funded)
Meta organizations (that fall outside the above areas)
- Talent gap - Small (~20 people)
- Money gap - Mixed (depends on agreement with Nick Beckstead)
I am not extremely confident in all these numbers (particularly the size of the AI talent gap), but I am confident in the broader claim that the gaps differ between cause areas, and that we would all benefit from making that distinction in public discourse. I am happy to update these estimates as people make good arguments in the comments. Below I go into further detail on how I arrived at them.
Poverty talent gap
In my experience, poverty organizations generally hire outside the EA movement for many roles. There are still small gaps for some poverty organizations hiring for management and leadership roles from the EA pool (~4), as well as some gaps in operational talent (~2). Part of this gap also comes from the possibility of founding more effective poverty charities (~4), such as a tobacco taxation or conditional cash transfer charity, along the lines of Charity Science Health and Fortify Health.
Poverty money gap
The money gap in poverty is huge, even looking only at charities significantly stronger than GiveDirectly, whose own gap is very large and arguably virtually unlimited. The gap is close to $100 million after Good Ventures funds its portion, and there is reason to expect it to grow, given recent changes in Good Ventures' funding plans and a strong group of incubation charities in GiveWell's pipeline. The gap only grows further if you think there are strong opportunities in poverty outside GiveWell's list. Assuming each person donates 50% of a $100,000 salary, it would take roughly 1,720 people doing E2G to fill this gap, and that is not even counting new GiveWell-incubated or -recommended charities.
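The back-of-envelope arithmetic behind the ~1,720 figure in the summary can be made explicit. A minimal sketch, using the $86 million gap quoted above and the stated assumptions of a $100,000 salary with 50% donated:

```python
# Rough estimate of how many earning-to-give (E2G) donors would be
# needed to fill the poverty funding gap, per the post's assumptions.
funding_gap = 86_000_000   # ~$86M remaining after Good Ventures funds its portion
salary = 100_000           # assumed annual salary of an E2G donor
donation_rate = 0.50       # assumed fraction of salary donated

donors_needed = funding_gap / (salary * donation_rate)
print(round(donors_needed))  # 1720
```

Note how sensitive the headline number is to the assumptions: halving the donation rate or the salary doubles the number of donors required.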
Animal rights talent gap
The talent gap for animal rights is very large. Many AR organizations are hiring and trying to grow as fast as possible. There is also considerable scope for entrepreneurship and founding new and effective animal rights organizations. The animal rights community as a whole is very small and the number of EAs in the movement is even more limited.
Animal rights money gap
Historically, animal rights has been chronically hampered by insufficient funding across the movement. However, the entry of Open Phil into the area has created a very different situation, so I now categorize the funding gap as mixed. Funding is fairly centralized: Open Phil's grantmaking and the AR Fund are run by the same person (Lewis Bollard), who controls nearly 50% of all funding in AR. If you largely agree with Lewis about priorities in the area, I would say the funding gap is small; if you hold very different views, it could be seen as large.
Artificial intelligence talent gap
The talent gap for artificial intelligence is middling: many organizations in the field need researchers, and there are also some gaps at organizations focused on meta-research. There are significant gaps in operational talent to support these organizations as well.
Artificial intelligence money gap
The money gap for AI organizations seems very small, with even large funders being turned away from many projects. Many organizations are very well funded; given the recent surge in publicity, AI, much like animal rights, went from being chronically underfunded to well funded in almost all areas. Furthermore, because funding is spread across a fairly wide range of funders, even people with unusual perspectives on AI will find it hard to identify good gaps.
Meta organizations talent gap
Importantly, in this section I consider only meta organizations that do not fall under another cause area; ACE, for example, would fall under animal rights, not meta. The talent gap for these organizations generally seems small, with some posted roles in leadership (~7), operations (~3), research (~3), and other general roles (~3) across organizations. There also seems to be some scope for founding new charities (~4).
Meta organizations money gap
Much like animal rights, funding is heavily centralized, with a handful of funders controlling a very high percentage of the total. And as in animal rights, one person both runs the EA Fund for meta organizations and is the lead investigator for Open Phil. Thus an EA's view of the funding gap will largely depend on how well their views align with Nick Beckstead's. The gap can range from very small to moderate (low millions) depending on how broadly you define meta organizations.
Overall, as you can see, the talent and money gaps vary considerably by cause. If you think poverty is the highest-impact area, earning to give is a very good choice. On the other hand, if you think animal rights is the best, figuring out how to contribute your talents directly may be a better way forward; if you agree with Lewis, that is. Regardless of which cause you think is highest priority and what you think the gaps truly are, breaking them down by cause area will help everybody make better decisions.
In 2015 you (Benjamin) wrote a post which, if I'm reading it right, aspires to answer the same question, but is in direct contradiction with the conclusions of your (Katherine's) post regarding which causes are relatively talent-constrained. I would be interested in hearing about the sources of this disagreement from both of you (assuming it is a disagreement, and not just the fact that time has passed and things have changed, or an issue of metrics or semantics).
Here is the relevant excerpt:
https://80000hours.org/2015/11/why-you-should-focus-more-on-talent-gaps-not-funding-gaps/
It sounds like both of you (Katherine and Benjamin) agree that AI is "talent constrained". Pretty straightforward, it's hard to find sufficiently talented people with the specialized skills necessary.
It sounds like the two of you diverge on global poverty, for reasons that make sense to me.
Katherine's analysis, as I understand it, straightforwardly looks at what GiveWell says the current global poverty funding gap is, which means that impact via talent basically relies on doing more good with the existing money, performing better than what is currently out there. (And how was your talent gap estimated? Is it just a count of the currently open positions on the EA job board?)
Benjamin's analysis, as I understand it, is that EA's growing financial influence means more money will come in fairly soon, and that effective altruists are pretty good at redirecting outside funds to their causes (so if you build good talent infrastructure and rigorously demonstrate impact and a funding gap, funding will follow).
Is this a correct summary of your respective arguments? I understand how two people might come to different conclusions here, given the differing methods of estimating and depending on what they thought about EA's ability to increase funding over time and close well demonstrated funding gaps.
(As an aside, Benjamin's post and accompanying documents made some predictions about the next few years. Can anyone link me to a retrospective on how those predictions have borne out?)
It sounds like you diverge on animal rights, for reasons I would like to understand.
Benjamin, it sounds like you / Joe Bockman are saying that ending factory farming is exceptional among popular EA causes in having more talent than they can hire and being in sore need of funding.
Whereas Katherine, it sounds like you're saying that animal rights is particularly in need of talent relative to all the other cause areas you've mentioned here.
These seem like diametrically opposed claims. Is this a real disagreement, or have I misread? I'm not sure what the source of the disagreement is, other than Katherine and Joe having different intuitions, or bird's-eye views of different parts of the landscape. Has Joe written more on this topic? If it's just a matter of two people's intuitions, it doesn't leave much room for evaluating either claim. (I get the sense that Katherine's claim isn't based on intuition but on the fact that EA animal organizations are currently expanding, which increases the estimated number of job openings available in the near future. Is that correct?)
(Motivation: I'm reading this post now as part of the CE incubation program's reading list, and felt surprised because the conclusions conflicted with my intuitions, some of which I think were originally formed by reading Benjamin's posts a few years ago. As the program aims to set me on a path which will potentially help me cause redirection of funding, redirection of talent, create room for more talent, and/or create room for more funding within global poverty or animal issues, the answers to these questions may be of practical value to me.)
I'd be happy if either of you could weigh in on this / explain the nature and sources of disagreement (if there is in fact a disagreement) a bit more!
(PS - can I tag two people to be notified by a comment? Or are people notified about everything that occurs within their threads?)