
There have been several surveys of attendees of what was previously called the ‘EA Leaders Forum’ about what they believe the community’s cause priorities should be (2017, 2018, 2019).

The relevance of these results has been criticised each year on the grounds that the attendees don’t represent the views of the wider community (e.g. in 2019). You can see some hypotheses about why there might be differences in views between the Leaders Forum group and others here. In particular, the former group is selected to be more into longtermism than average, potentially skewing the results.

Similarly, CEA and 80,000 Hours have been criticised for not having their priorities line up with those of the community (e.g. this post summarises criticisms about representativeness, and 80k was recently criticised here).

Critics of the Leaders Forum survey often appeal to the EA Survey as a reflection of the community’s priorities. This is a much broader survey with about 2,500 respondents, and when we compare the ideal portfolio chosen in the Leaders Forum Survey to the top cause preferences of the EA Survey respondents (as explained in more depth later), we see significant differences:

The problem with this is that the EA Survey is open for anyone to take, and without knowing more about who’s taking it, it’s unclear why it should be used to set community priorities.

Fortunately, the 2019 EA Survey lets us make progress on this question. It asked many more questions about people’s level of engagement, which means we can further subdivide the responses.

In this post, I do a quick analysis of these new results to show that based on this data, the Leaders Forum survey priorities seem to reflect the most engaged half of the community – around 1,000-2,000 people – pretty well. This suggests the issue we actually face is not a difference between the leaders and the core of the community, but rather between the core and those who are new or moderately engaged.

The data

In the 2019 EA Survey, one question asked about how engaged people feel they are on a scale of 1 to 5, where 5 was defined as: “I am heavily involved in the effective altruism community, perhaps helping to lead an EA group or working at an EA-aligned organization. I make heavy use of the principles of effective altruism when I make decisions about my career or charitable donations.”[1]

About 450 out of 2,100 respondents to this question reported a “5”. My estimate is that only about 40% of engaged EAs filled out the survey in 2019,[2] which would suggest there are around ~1,000 people with this level of engagement – a much wider group than the ~30 or so people at the Leaders Forum.
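
For concreteness, here is the back-of-envelope arithmetic as a minimal Python sketch (the 40% response rate is the informal estimate from footnote 2, not a measured figure):

```python
# Back-of-envelope estimate of how many people are at engagement level 5.
# Assumes the ~40% response rate (an informal estimate, see footnote 2)
# applies to this group.
respondents_at_level_5 = 450    # out of ~2,100 who answered the question
estimated_response_rate = 0.40  # fraction of engaged EAs who took the survey

estimated_total = respondents_at_level_5 / estimated_response_rate
print(f"Estimated 5/5-engaged community members: ~{estimated_total:.0f}")
# prints ~1125, i.e. on the order of 1,000 people
```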

If we look at what different members of this group said they think is the ‘top cause’, and compare that to the ‘ideal portfolio’ chosen by the mean attendee of the Leaders Forum[3] (making some assumptions about how the categories line up), we find:

As you can see, these line up pretty well, except that the EA Survey group actually has a greater collective preference for ‘meta’ cause prioritisation work. (Note that ‘other near term’ wasn’t offered as a category on the Leaders Forum survey so may be undercounted – I hope this will be fixed this year.)

Using broader buckets, among the leaders survey we get 54% longtermist, 20% meta, 24% near term, and 2% other. Among the EA Survey respondents we find 48% longtermist, 28% meta, and 24% near term.
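
The bucketing step is easy to make concrete. Here is a minimal sketch; the mapping from finer categories to buckets reflects my reading of the post, and the fine-grained shares are hypothetical, chosen only so that the buckets sum to the survey-respondent figures above:

```python
# Sketch of collapsing finer cause categories into the broad buckets above.
# The fine-grained shares are invented; only the roll-up logic matters.
bucket_of = {
    "AI": "longtermist", "Biosecurity": "longtermist",
    "Other longtermist": "longtermist",
    "Cause prioritisation": "meta", "Movement building": "meta",
    "Global poverty": "near term", "Animal welfare": "near term",
    "Other near term": "near term",
}

fine_shares = {  # hypothetical percentages summing to 100
    "AI": 25, "Biosecurity": 10, "Other longtermist": 13,
    "Cause prioritisation": 15, "Movement building": 13,
    "Global poverty": 12, "Animal welfare": 7, "Other near term": 5,
}

buckets = {}
for cause, share in fine_shares.items():
    buckets[bucket_of[cause]] = buckets.get(bucket_of[cause], 0) + share

print(buckets)  # {'longtermist': 48, 'meta': 28, 'near term': 24}
```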

(The 2020 Leaders Forum results are not yet published, but at a glance they line up even better, with more agreement on AI and other near term causes.)

I should be very clear: these questions are asking about different things. The Leaders Forum one is about the ideal portfolio of resources, while the question in the EA Survey is about what people believe is the top cause. However, it seems notable that how the leaders think resources should be allocated roughly lines up with the distribution of the most engaged EAs’ top cause areas.[4]

As a double check, we can look at another EA Survey question, which asked people which cause they currently work on. When the responses are normalised to 100% and compared again to the mean Leaders Forum respondent’s ideal distribution of resources, we find:

Again, the picture is fairly similar, suggesting that the allocation of labour is not far from what the leaders guess is the ideal portfolio.

The main difference is that AI seems underinvested in (perhaps suggesting practical career support should focus here), and ‘other near term’ may be overinvested in. Note that the Leaders Forum question was about how both money and time should be allocated, e.g. global health receives more funding via GiveWell than any other issue, so if we were also looking at funding, global health would appear over-allocated.
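
The normalisation mentioned above is just converting raw counts to percentages. A minimal sketch with hypothetical counts (not the actual survey data):

```python
# Convert raw counts of "which cause do you currently work on?" into
# percentages, so they can be compared with the leaders' ideal portfolio.
# The counts below are invented for illustration.
work_counts = {"AI": 60, "Meta": 55, "Global poverty": 45, "Other": 40}

total = sum(work_counts.values())  # 200
work_shares = {cause: 100 * n / total for cause, n in work_counts.items()}

for cause, share in sorted(work_shares.items(), key=lambda kv: -kv[1]):
    print(f"{cause}: {share:.1f}%")
# AI: 30.0%, Meta: 27.5%, Global poverty: 22.5%, Other: 20.0%
```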

One potential problem with this analysis is that you might think it’s circular: it asks about things like working at an EA org or attending EAG, and maybe it’s hard to do these if you’re not already into longtermism. If that’s true, then it would be no wonder that engaged EAs are, like Leaders Forum attendees, pretty into longtermism. What other samples might be better?

In Joey Savoie’s piece arguing in favour of greater representativeness, he suggests surveying EA Forum respondents as a good proxy for what’s representative. We now have this data too:

Again, the picture is similar, suggesting that the views of those taking the Leaders Forum survey line up fairly well with those of Forum users, who in turn line up with ‘5/5 engaged’ community members. There are about 1,500 active users of the Forum in total.

I also quickly looked at cause preferences among those who reported that they work at EA orgs, and again the picture was similar.

Likewise, if you look at ‘length of time involved’ among all EA survey respondents as a proxy for engagement, you see a clear tilt towards longtermist issues compared to the whole sample.

If we broaden to consider those who reported 4 or 5 out of 5 on engagement, then the differences get a bit larger. “4” on engagement was defined as: “I’ve engaged extensively with effective altruism content (e.g. attending an EA Global conference, applying for career coaching, or organizing an EA meetup). I often consider the principles of effective altruism when I make decisions about my career or charitable donations.”

For instance, the longtermist percentage is now 43% compared to 54% among the leaders, and the near-termist percentage is 33% compared to 24%. Still, these strike me as fairly modest differences.

People who clicked “4” or “5” represent roughly the top half of respondents to the question in terms of engagement (approx 900 people). My guess is that there are over 2,000 people at about this level of engagement in the community in total.

If we look at those who reported 3 out of 5 on engagement, the two most popular causes are Global Poverty (29.8%) and Climate Change (19.5%) – a significant difference from both the Leaders Forum survey and the most engaged members.

In brief, newer and less engaged members of the community are more likely to stick to more common sense priorities, which is not surprising.

What might this mean?

Nowadays at least,[5] the Leaders Forum respondents represent the priorities of the most engaged 1,000-2,000 people pretty well.[6]

At 80k, we recently said we’d put around 50% of effort into our priority paths, 30% into other ways of helping solve our priority problems & other issues we list here, and 20% in other problems, such as global health and factory farming. This also lines up with the priorities for the core of the community.

My view is that it would be reasonable to design some other infrastructure, such as EA Global, around the priorities of this core ~2,000 people as well.

Instead of there being a difference between the Leaders Forum respondents and the core, the divergence is mainly between the core and those who are new or moderately engaged (‘the middle’).

This suggests that the main issue we face is how best to form a bridge between new members and the core. As long as the core focuses on unusual priorities – which is likely, given that neglectedness is used as a heuristic for prioritisation – there’s a risk that new members are surprised when they discover these unusual priorities, putting off people who (i) could have a significant impact in other problem areas, or (ii) would later have become convinced of the core’s priorities.

I think what to do about this is a really difficult question, which I’m not going to go into here beyond saying the following: it’s tempting to try to solve this by ‘meeting people where they are’ and talking a lot more about climate change and global health. But I don’t expect this indirect approach to be better at finding people who might end up in the core; it means we spend less time on higher-impact areas (where we still have a lot of gaps); and it can lead to the perception of a ‘bait and switch’ when people discover the ‘true’ priorities. These are some of the reasons we moved away from that approach at 80k. Today our approach is to state our priorities directly, and to focus on explaining them as clearly and reasonably as we can, while trying to remain welcoming to those interested in other areas.

Thank you to David Moss for helping to prepare the data and charts, and to Aaron for posting.


  1. The full scale was:
     1: No engagement: I’ve heard of effective altruism, but do not engage with effective altruism content or ideas at all.
     2: Mild engagement: I’ve engaged with a few articles, videos, podcasts, discussions, events on effective altruism (e.g. reading Doing Good Better or spending ~5 hours on the website of 80,000 Hours).
     3: Moderate engagement: I’ve engaged with multiple articles, videos, podcasts, discussions, or events on effective altruism (e.g. subscribing to the 80,000 Hours podcast or attending regular events at a local group). I sometimes consider the principles of effective altruism when I make decisions about my career or charitable donations.
     4: Considerable engagement: I’ve engaged extensively with effective altruism content (e.g. attending an EA Global conference, applying for career coaching, or organizing an EA meetup). I often consider the principles of effective altruism when I make decisions about my career or charitable donations. ↩︎

  2. This is based on looking at how many people took the survey from specific organisations, and comparing to their total number of employees. It’s also based on an informal poll of CEA staff. ↩︎

  3. Medians seem to match slightly better, if anything. ↩︎

  4. One difference is that the leaders might think a cause is less effective but should still receive an allocation in the portfolio, whereas the EA Survey respondents were asked which cause they think is most effective. Relatedly, the leaders might think the portfolio should be representative rather than reflect their own views. ↩︎

  5. It’s true that this shift might have happened due to the priorities of the leaders in the past, and perhaps at that point there was a disconnect between the core and the Leaders Forum respondents. However, for decisions going forward, we should focus on the community as it exists today. ↩︎

  6. Though there could still be differences here that aren’t picked up in this method. For instance, maybe the leaders think we should ideally allocate even more to ‘meta’ but that we don’t have enough people able to do it, so they set the portfolio lower. In addition, the EA survey asked about people’s top cause priority, while the Leaders Forum Survey asked about the ideal portfolio – someone might think a cause should be in the portfolio even if it’s a lower priority. I think there are also difficult communication and cultural challenges in how we talk about the issues that receive a smaller allocation in the portfolio. ↩︎

Comments (26)

Thank you for looking into the numbers! While I don't have a strong view on how representative the EA Leaders forum is, taking the survey results about engagement at face value doesn't seem right to me.

On the issue of long-termism, I would expect people who don't identify as long-termists to now report being less engaged with the EA Community (especially with the 'core') and to identify as EA less. Long-termism has become a dominant orientation in the EA Community, which might put people off, even if their personal views and actions related to doing good haven't changed, e.g. their donation amounts and career plans. The same goes for looking at how long people have been involved with EA – people who aren't compelled by long-termism might have dropped out of identifying as EA without actually changing their actions.

Hi Denise, on the second point, I agree that might be a factor (I mention it briefly in the article) among others (such as people changing their minds as in David's data). My main point is that this means the problem we face today is more like "people are bouncing off / leaving EA because the most engaged ~2000 people focus on unusual causes" rather than "the leaders don't represent the most engaged 2000 people".

taking the survey results about engagement at face value doesn't seem right to me

Not sure I understand – how do you think we should interpret them? Edit: Nevermind, now I get it.

Regarding the latter issue, it sounds like we might address it by repeating the same analysis using, say, EA Survey 2016 data? (Some people have updated their views since and we'd miss out on that, so that might be closer to a lower-bound estimate of interest in longtermism.)

Fortunately we have data on this (including data on different engagement levels using EA Forum as a proxy) going back to 2017 (before that the cause question had a multi-select format that doesn't allow for easy comparison to these results).

If we look at the full sample over time using the same categories, we can see that there's been a tendency towards increased support for long-termist causes overall and a decline in support for Global Poverty (though support for Poverty remains >50% higher than for AI). The "Other near term" trend goes in the opposite direction, but this is largely because this category combines Climate Change and Mental Health, and we only added Mental Health to the EAS in 2018.

Looking at EA Forum members only (a highly engaged ~20% of the EAS sample), we can see that there's been a slight trend towards more long-termism over time, though this trend is not immediately obvious, since between 2018 and 2019 EAs in this sample seem to have switched between AI and other long-termist causes. But on the whole the EA Forum subset has been more stable in its views (and closer to the LF allocation) over time.

Of course, it is not immediately obvious what we should conclude from this about dropout (or decreasing engagement) among non-longtermist people. We do know that many people have been switching into long-termist causes (and especially AI) over time (see below). But it's quite possible that non-longtermists have been dropping out of EA over a longer time frame (pre-2017). That said, I do think that the EA Forum proxy for engagement is probably more robust to these kinds of effects than the self-reported (1-5) engagement level: although people might drop out of Forum membership due to disproportionately longtermist discussion, the Forum still has at least a measure of cause diversity, whereas facets of the engagement scale (such as EA org employment and EA Global attendance) are more directly filtering on long-termism. We will address data about people decreasing engagement or dropping out of EA due to perceiving EA as prioritizing certain causes too heavily in a forthcoming EA Survey post.

Both images from the EA Survey 2019: Cause Prioritization

Yes, that is what I meant. Thank you so much for providing additional analysis!

Thank you for preparing these - very interesting!

I share Denise's worry.

My basic concern is that Ben is taking the fact that there is high representativeness now to be a good thing, while not seeming so worried about how this higher representativeness came about. This higher representativeness (as Denise points out) could well just be the result of people who aren't enthused with the current leaders' vision simply leaving. The alternative route, where the community change their minds and follow the leaders, would be better.

Anecdotally, it seems like more of the first has happened (but I'd be happy to be proved wrong). Yet, if one thinks representativeness is good, achieving representativeness by having people who don't share your vision leave doesn't seem like a good result!

I'm also not sure I know what you mean.

I think the point is that some previously highly engaged EAs may have become less engaged (so dropped out of the 1000 people), or some would-be-engaged people didn't become engaged, due to the community's strong emphasis of longtermism. So I think it's all the same point, not two separate points.

I think I personally know a lot more EAs who have changed their views to longtermism than EAs who have dropped out of EA due to its longtermist focus. If that's true of the community as a whole (which I'm not sure about), the main point stands.

This is very much an aside, but I would be really curious how many people you perceive as having changed their views to longtermism would actually agree with this. (According to David's analysis, it is probably a decent amount.)

E.g. I'm wondering whether I would count in this category. From the outside I might have looked like I changed my views towards longtermism, while from the inside I would describe my views as pretty agnostic, but I prioritised community preferences over my own. There might also be some people who felt like they had to appear to have or act on longtermist views to not lose access to the community.

Some may also have started off longtermist without that being obvious - I knew I was a total utilitarian and cared about the long run future from ~2009, but didn't feel like I knew how to act on that until much later. So I guess from the outside my views may look like they changed over the last couple of years in a way they didn't.

Yeah, I think this is worth taking seriously. (FWIW, I think I had been mostly (though perhaps not completely) aware that you are agnostic.)

Ben, could you elaborate on how important you think representativeness is? I ask because the gist of what you're saying is that it was bad that the leaders' priorities were unrepresentative before, which is why it's good there is now more alignment. But this alignment has been achieved by the priorities of the community changing, rather than the other way around.

If one thought EA leaders should represent the current community's priorities, then the fact the current community's priorities had been changed - and changed, presumably, by the leaders - would seem to be a cause for remorse, not celebration.

As a further comment, if representativeness is a problem the simple way to solve this would be by inviting people to the leaders' forum to make it more representative. This seems easier than supposing current leaders should change their priorities (or their views on what they should be for the community).

I'm not aiming to take a stance on how important representativeness is. My goal is to get people to focus on what I see as the bigger issue we face today: how should we design a community when the new members and the "middle" have mainstream cause priorities and the "core" have (some) unusual ones?

I've strong upvoted this piece because I think the analysis looks really good and it taught me something new. I really liked all of it except for the last few paragraphs, which seemed to suggest long-termism should be ~all of EA's focus rather than about half.

Hey Khorton, I didn't mean to imply that. I think the last paragraphs still stand as long as you assume that we'll want some of the core of EA to work on unusual causes, rather than 100%.

This is a good analysis, but I think it oversimplifies the short-term vs. long-term divide: people often aren't 100% one or the other. The same goes for classifying particular cause areas as short vs. long term: some existential risk work could be seen as highly valuable even by someone who didn't value lives tomorrow, and some interventions that are seen as near term could have a much bigger impact on the future.

I agree, I don't like the near-termism vs. longtermism terms, since I think it makes it sound like a moral issue whereas it's normally more about epistemology or strategy, and like you say for most people it's a matter of degree. I hope we can come up with better terms.

I also agree people should be clear about 'causes' vs. 'worldviews'. You could be longtermist in your worldview but want to work on economic empowerment, and you could be near termist but want to work on AI GCRs.

I did my analysis in terms of causes, though my impression is that results are similar when we ask about worldviews instead (because in practice causes and worldviews are reasonably correlated).

I was attracted to EA several years ago and was delighted to have finally found people who share my desire to make the world a better place. I, like many others I have met since, was struggling to know how to do that most effectively. For me, the goal of EA should always be to make the world a better place, and the way to do this is not to create a cult where you have to be in the core to be valued.


EA needs a blend of people. It needs some people identifying effective activities. It needs some people doing those effective activities. It needs some people funding those effective activities. And it needs some people trying to drum up other people to do those things. The people who earn-to-give to these activities will not be counted by you as 'core', but are as necessary as the other people.

I still think this post was making an important point: that the difference in cause views in the community was between the most highly engaged several thousand people and the more peripheral people, rather than between the 'leaders' and everyone else.

I really love the new section in Key Ideas, "Other paths that may turn out to be very promising". I've been concerned that 80K messaging is too narrow and focuses too much on just your priority paths, almost to the point of denigrating other EA careers. I think this section does a great job contextualizing your recommendations and encouraging more diverse exploration, so thanks!

Somewhat related: I'm guessing 80K focuses on a narrower range of priority paths partially because, as an organization, specializing is valuable for its own sake. If there are a dozen equally impactful areas you could work in, you won't put equal effort into each area – you're better off picking a few and specializing, even if the decision is arbitrary, so you can reap returns to scale within the field by learning, making connections, hiring specialists, and building other field-specific ability.

If you actually think about things this way, I would suggest saying so more explicitly in your Key Ideas, because I didn't realize it for a long time and it really changes how I think about your recommendations.

(Unrelated to this post, but hopefully helpful)

Nice post. I have always thought this was a problem with the argument - so much so that when I read your opening paragraphs I found myself getting annoyed that you were rehashing it and not considering the obvious objection... the obvious objection which was in fact the entire purpose of the post.

As long as the core focuses on unusual priorities – which is likely, given that neglectedness is used as a heuristic for prioritisation – there’s a risk that new members are surprised when they discover these unusual priorities

Perhaps there are also some good reasons that people with different life experience both a) don't make it to 'core' and b) prioritize more near term issues.

There's an assumption here that weirdness alone is off-putting. But, for example, technologists are used to seeing weird startup ideas and considering the contents.

This suggests a next thing to find out is: who disengages and why.

In EA London, women were equally likely to attend an EA event for the first time, but less and less likely to attend further events*. Anecdotally, some of these women didn't really see how attending EA events would help them to do good. https://forum.effectivealtruism.org/posts/2RfQT7cybfS8zoy43/are-men-more-likely-to-attend-ea-london-events-attendance

*EA London has since changed its approach and now focuses less on events.

I'm interested in why climate change isn't called out as a topic in the charts above – is it merged into 'other longtermist' or 'other near term'? In the 2019 survey, it was the second highest priority (and the fastest growing one), and I understand that 80K has updated (also explicitly here) to seeing it as part of the longtermist portfolio.

Yes, unfortunately the leaders forum survey didn't ask about it as its own category, so it's merged into the others you mention.