
This is the eighth article in the EA Survey 2017 Series. You can find supporting documents at the bottom of this post, including previous EA surveys conducted by Rethink Charity, and an up-to-date list of articles in the series. Get notified of the latest posts in this series by signing up here.

By Anna Mulcahy, Tee Barnett, and Peter Hurford

Summary

We asked self-identified EAs how they first heard about the movement and what resource or tool persuaded them to get more involved. It’s important to bear in mind, however, the limitations of both the questions themselves and respondents’ ability to recall events from possibly many years ago[1].

 

  • The number of people joining the EA movement continues to increase year-on-year.

  • The top five sources of introduction to EA, in descending order, are ‘Personal Contact’, ‘LessWrong’, ‘Other book, article, or blog post’, ‘SlateStarCodex’, and ‘80,000 Hours’.

  • As of 2016, LessWrong dropped out of the top five list of introductory sources after being one of the top three from 2009 to 2015.

  • The top five sources of engagement for new EAs in 2017, in descending order, are ‘GiveWell’, ‘Book or Blog’, ‘80,000 Hours’, ‘Personal Contact’, and ‘Giving What We Can’.

What year did EAs first get involved with EA?

EA survey results from 2017 show an increase in the number of new members, confirming trends published in “Is EA Growing? Some EA Growth Metrics for 2017”. Results show growth of nearly 20% in the number of new recruits to the community for 2016, compared to 2015. This certainly reflects gains in recruitment year-on-year, though without efforts to track attrition rates it is possible that the total community growth could be less than what is suggested here.
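As a rough illustration of the growth calculation, here is a minimal sketch in Python; the join-year counts below are hypothetical placeholders, not the actual survey tallies.

```python
# Hypothetical counts of respondents by the year they report first
# getting involved in EA (placeholder values, not the survey's figures).
new_recruits = {2013: 210, 2014: 290, 2015: 370, 2016: 440}

# Year-on-year growth in the number of new recruits.
for year in sorted(new_recruits)[1:]:
    prev = new_recruits[year - 1]
    growth = (new_recruits[year] - prev) / prev
    print(f"{year}: {growth:.0%} more new recruits than {year - 1}")

# Note: this measures gross recruitment only; without an attrition
# estimate, net community growth could be lower than these figures suggest.
```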

How did people first hear about EA?

All-time figures for first introductions to EA were topped by ‘Personal Contact’ and ‘LessWrong’, with ‘Other book, article, or blog post’ coming in a distant third. Scott Alexander’s SlateStarCodex (SSC) and 80,000 Hours round out the top five. It is worth noting the considerable proportion of individuals who selected ‘Other’, which would place it as the third most popular answer if counted among the specific referral sources.

Responses were then cross-referenced against the question, “In roughly which year did you first get involved in EA?”. This allowed for the 2017 results to be interpreted within a longer arc of EA surveys conducted in the last few years, and provided some indication about how the most successful sources for spreading the word about EA have changed over time.

 

We can also see how referrers have changed over time by cross-referencing people’s self-report of how they got involved with the year they report joining the movement. When interpreting the 2017 results within this context, we find that getting introduced to EA through personal networks has historically been most common (Table 3).
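For readers who want to reproduce this kind of breakdown, a minimal pandas sketch of the cross-referencing might look like the following. The column names `first_heard` and `year_joined`, and the rows of data, are assumptions for illustration rather than the survey’s actual fields.

```python
import pandas as pd

# Hypothetical respondent-level extract; one row per survey respondent.
df = pd.DataFrame({
    "first_heard": ["Personal Contact", "LessWrong", "80,000 Hours",
                    "Personal Contact", "SlateStarCodex", "LessWrong"],
    "year_joined": [2013, 2013, 2015, 2016, 2016, 2014],
})

# Cross-tabulate introduction source against self-reported year of joining,
# which is how the "top sources by year" tables in this post are built.
referrals_by_year = pd.crosstab(df["first_heard"], df["year_joined"])
print(referrals_by_year)
```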

As for year-on-year trends in particular referral sources, there are several noteworthy changes over time. For instance, LessWrong was a wellspring of new EAs for several years before that community faded. 80,000 Hours is typically among the top referrers, and while SlateStarCodex has also been important over the years according to this survey, the potential for survey bias due to over-sampling SSC readers persists.


From 2009 to 2011, Giving What We Can ranked highly in response to the question “How did you first hear about EA?”. However, after 2011 it progressively fell in popularity and did not rank in the top five ways of first hearing about EA from 2014 to 2016. In addition, the number of people who learned about EA through 80,000 Hours almost doubled from 2014 to 2015 (see Table 4)[2]. SlateStarCodex has also shown increasing success as a referral source for EA since 2014 (see Table 5). Again, care must be taken when interpreting these trends, as there have been fluctuations in how much each group promoted the EA Survey.

As mentioned previously, LessWrong dropped out of the top five in 2016; nevertheless, the rationalist website’s historical strength in drawing EA-adjacent individuals suggests why supporting the newest iteration of LessWrong may have been an obvious choice for EA Grants.

 

Comparison with a Survey of the EA Facebook Group

 

The EA Facebook group has become a popular hub for the EA community. Indeed, 54.6% of EAs in our survey sample report being in the group, and almost 18% of EA survey respondents were referred from Facebook. Notably, as a condition of joining the EA Facebook group, every prospective member is asked to report, in freeform text, how they heard about EA. Julia Wise and other EA Facebook moderators took a convenience sample of 100 such responses in late 2017 and produced the following results:

 

To compare this to our data, we selected the 406 EAs who self-reported being a member of the EA Facebook group and who said they joined EA in 2016 or 2017 (though this would only run up to April-June 2017, when the survey was active). Among this subsample, the top five results were 19% saying personal contact, 17% saying “other”, 10% saying 80,000 Hours, 7% saying Doing Good Better, and 6% saying a TED Talk. This matches closely with the results gathered from Facebook, despite different data collection methods (a response required for group membership vs. voluntary survey taking) and reporting methods (self-report from a list of choices including ‘Other’ vs. freeform text with no prompts).
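A sketch of how that subsample comparison can be reproduced is below, again with hypothetical column names such as `in_fb_group`, `year_joined`, and `first_heard`, and made-up rows; the real survey data will differ.

```python
import pandas as pd

# Hypothetical respondent-level data; values and column names are illustrative.
df = pd.DataFrame({
    "in_fb_group": [True, True, False, True, True],
    "year_joined": [2016, 2017, 2016, 2015, 2017],
    "first_heard": ["Personal Contact", "Other", "80,000 Hours",
                    "TED Talk", "Doing Good Better"],
})

# Restrict to EA Facebook group members who say they joined EA in 2016 or 2017,
# mirroring the 406-person subsample described above.
subsample = df[df["in_fb_group"] & df["year_joined"].isin([2016, 2017])]

# Share of the subsample naming each introduction source, largest first.
shares = subsample["first_heard"].value_counts(normalize=True)
print(shares.head(5).round(2))
```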

What got people more involved with EA?

Respondents were also asked what motivated them to get involved with EA[3]. While the previous question can indicate the reach and accessibility of EA resources, this question can be used to indicate how effective these resources are at persuading people to join the EA community and participate actively.

 

Introduction sources and sources of further engagement are not always one and the same. As seen in Table 7, personal networking did not come out as the top source for actually getting people involved in EA, though it remains within the top five. Once people are introduced to EA, it would appear that GiveWell, books and/or blogs, and 80,000 Hours are the three most potent ways to keep new EAs engaged. This may come as no surprise, considering these answer options offer a wealth of in-depth information. LessWrong would presumably also fall into this camp, but the rationalist website may have fallen down the list for the reasons cited above. EA Global (EAG) performed quite well considering the relatively brief amount of time new EAs have to engage at a given conference.

Endnotes

[1] As mentioned in previous articles, care should be taken when interpreting EA survey results. Questions asking where people first heard about EA are open to significant human error, as respondents are required to rely on memory and recall something that may have happened five or more years ago. Furthermore, respondents could have heard about EA from multiple sources in a short period of time, but may not be able to pinpoint exactly which of those sources they heard about it from first. Having ‘cannot remember’ as an option can only reduce errors from memory recall up to a point.

 

The same potential for error applies when asking respondents to recall what caused them to actually get involved in EA. For this question, however, respondents could select multiple answers, as multiple factors often contribute to such a decision, so it relied less on accurate recall of a single, specific event.

 

[2] This may be the case for a few reasons. 80,000 Hours assisted in distributing the survey this year, which was not the case in 2016, when no EA survey was conducted. According to CEO and co-founder Ben Todd, 80,000 Hours’ web traffic has nearly doubled each year for the past few years. Finally, the Effective Altruism Facebook group survey posted by Julia Wise illustrates that 80,000 Hours is a popular referral source among new members.

 

[3] The full text of the question was “Which factors were important in ‘getting you into’ Effective Altruism, or altering your actions in its direction? Check all that apply.”

Credits

Post written by Anna Mulcahy, Tee Barnett, and Peter Hurford, with edits from Ben Todd.

 

The annual EA Survey is a volunteer-led project of Rethink Charity that has become a benchmark for better understanding the EA community. A special thanks to Ellen McGeoch, Peter Hurford, and Tom Ash for leading and coordinating the 2017 EA Survey. Additional acknowledgements include: Michael Sadowsky and Gina Stuessy for their contribution to the construction and distribution of the survey, Peter Hurford and Michael Sadowsky for conducting the data analysis, and our volunteers who assisted with beta testing and reporting: Heather Adams, Mario Beraha, Jackie Burhans, and Nick Yeretsian.

 

Thanks once again to Ellen McGeoch for her presentation of the 2017 EA Survey results at EA Global San Francisco.

 

We would also like to express our appreciation to the Centre for Effective Altruism, Scott Alexander via SlateStarCodex, 80,000 Hours, EA London, and Animal Charity Evaluators for their assistance in distributing the survey. Thanks also to everyone who took and shared the survey.

EA Survey 2017 Series Articles

I - Distribution and Analysis Methodology

II - Community Demographics & Beliefs

III - Cause Area Preferences

IV - Donation Data

V - Demographics II

VI - Qualitative Comments Summary

VII - Have EA Priorities Changed Over Time?

VIII - How do People Get Into EA?

 

Please note: this section will be continually updated as new posts are published. All 2017 EA Survey posts will be compiled into a single report at the end of this publishing cycle.

 

Prior EA Surveys conducted by Rethink Charity (formerly .impact)

The 2015 Survey of Effective Altruists: Results and Analysis

The 2014 Survey of Effective Altruists: Results and Analysis

 

Comments

I'm not sure how useful this data is given that there are major distribution effects, i.e. if I distribute the survey through Less Wrong, I'll find a lot of people who first heard of the movement through Less Wrong, etc.

Yes - 80,000 Hours only briefly mentioned the survey at the end of one newsletter. If we had promoted it more heavily, we could have probably got more than twice as many submissions, and they would be more tilted towards people who first found out about EA from 80,000 Hours. As a comparison, our annual impact survey gets around 1000 responses each year, which would make it about the same size as this survey.

Yep, that is an issue. One idea might be to look at the data for each referral source (e.g., how everyone who heard about the survey through Facebook heard about EA, then how everyone who heard about the survey through SlateStarCodex heard about EA, etc.).

I agree, this is something we acknowledge multiple times in the post, and many times throughout the series. The level of rigor it would take to bypass this issue is difficult to reach.

This is also why the section where we see some overlap with Julia's survey is helpful.

Did SlateStarCodex even exist before 2009? I'm sceptical - the post archives only go back to 2013: http://slatestarcodex.com/archives/. Maybe not a big deal but does suggest at least some of your sample were just choosing options randomly/dishonestly.

They could also be referring to earlier writing by the same author at other addresses.

Re: Table 7 and the sentences "As seen in Table 7, personal networking did not come out as the top source for actually getting people involved in EA, though it remains within the top five. Once introduced to EA, it would appear GiveWell, books and/or blogs, and 80,000 Hours are the three most potent ways to keep engage new EAs."

The Personal Contact # is higher than the 80k hours # in the table (442 > 423), so either the sentences or the table need to be corrected, I think.
