
Is EA growing? Rather than speculating from anecdotes, I decided to collect some data. This is a continuation of the analysis started last year. For each trend, I collected the raw data and also highlighted in green where the highest point was reached (though this may be different from the period with the largest growth depending on which derivative you are looking at). You can download the raw data behind these tables here.
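
(To make the “derivative” framing concrete, here is a minimal sketch of the calculation I mean, using made-up numbers rather than the actual figures in the CSV. The first yearly difference is the rate of change discussed below, and the second difference is “the next derivative”.)

```python
# Minimal sketch: yearly "derivatives" of a metric, with made-up numbers.
metric = {2015: 120, 2016: 180, 2017: 260, 2018: 300}  # illustrative values only

years = sorted(metric)
rate_of_change = {y: metric[y] - metric[y - 1] for y in years[1:]}                 # first difference
acceleration = {y: rate_of_change[y] - rate_of_change[y - 1] for y in years[2:]}   # second difference

print(rate_of_change)  # {2016: 60, 2017: 80, 2018: 40} -> growth peaked in 2017
print(acceleration)    # {2017: 20, 2018: -40} -> growth slowed even though the level peaked in 2018
```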







Implications

This year, I decided to separate growth stats into a few different categories, looking at how growth changes as people learn about EA (reading); get more involved by joining a newsletter, a Facebook group, or the EA Forum, or by subscribing to a podcast (joining); deepen their commitment by self-identifying as EA on the EA Survey and/or taking a pledge (committing); and have an impact by doing something, like donating or changing their careers (doing)[33].

When looking at this, it appears that there has been a decline in people searching for and reading about EA (at least in the ways we track), with the exception of 80,000 Hours pageviews and EA Reddit page subscriptions, which continued to grow but at a much lower pace. When we look at the rate of change, we can see a fairly clear decline across all metrics:

We can also see that when it comes to driving initial EA readership and engagement, 80,000 Hours is very clearly leading the pack while other sources of learning about EA are declining a bit:

In fact, the two sources of learning about EA that seem to best represent natural search -- Google interest and Wikipedia pageviews -- appear somewhat correlated and are now both declining together.

However, more people are engaging with EA in closer ways (what I termed “joining”) -- while the growth rate of the EA Newsletter and the 80K Newsletter has slowed, the EA Facebook group is more active, the EA Reddit and total engagement from 80K continue to grow, and new avenues like Vox's Future Perfect and 80K's podcast have opened up. This view of growth can change depending on which derivative you look at, though. Looking at the next derivative makes clear that there was a large explosion of interest in 2017 in the EA Reddit and the EA Newsletter that wasn’t repeated in 2018:

Additionally, Founders Pledge continues to grow and OFTW has had explosive growth, though GWWC has stalled out a bit. The EA Survey has also recovered from a sluggish 2017 to break records in 2018. Looking at the rate of change shows Founders Pledge clearly increasing, GWWC decreasing, and OFTW growing fairly rapidly in 2018 after a slowdown in 2017.


Lastly, the part we care about most seems to be the strongest -- while tracking the actual impact of the EA movement is really hard and very sensitive to outliers, nearly every doing/impact metric we do track was at its strongest in either 2017 or 2018, with only GiveWell and 80K seeing a slight decline in 2018 relative to 2017. However, looking at the actual rate of change paints a bleaker picture, suggesting we may be approaching a plateau.


Conclusion

Like last year, it remains a bit difficult to infer broad trends, given that a decline for one year might be the start of a true plateau or decline (as appears to be the case for GWWC) or may just be a one-time blip prior to a bounce back (as appears to be the case for the EA Survey[34]).

Overall, the decline in people first discovering EA (reading) and the growth of donations and career changes (doing) make sense, as they are likely the result of the intentional effort across several groups and individuals in EA over the past few years to focus on high-fidelity messaging and on growing the impact of pre-existing EAs, along with deliberate decisions to stop mass marketing, Facebook advertising, etc. The hope is that while this may bring in fewer total people, the people it does bring in will be much higher quality on average. Based on this, while EA is maybe not growing as fast as it could if we optimized for short-run growth, I’d provisionally conclude that EA is growing in exactly the way one would expect and intend for it to grow. Additionally, clear growth in pledges and money raised by Founders Pledge, Effective Giving, and One For The World shows a potentially promising new path for future growth that could lead to many more donations in the future.

However, I don’t think we should take this at face value and assume the EA movement is safe from decline -- if fewer people are initially discovering EA, this could lead to much slower or reduced growth in impact a few years down the line, as fewer new people are added to the overall pool of EAs who can be counted on to have an impact in later years. Indeed, looking at the rate of change in these metrics shows a bleaker picture, with EA having gone through a critical acceleration period that has now mostly ended, potentially bringing about a future plateau in some of these statistics.

We’ll continue to monitor these trends and see if they hold up in future years or if causes for concern emerge. Also, please feel free to mention other metrics we should consider adding for next year[35].


Corrections

3 June 2019 - Founders Pledge information was originally given in thousands when it should have been in millions. Additionally, the totals given were slightly off (especially for 2015). Both issues have now been corrected in the table, graphs, and downloadable CSV. The correction made the rate of change for Founders Pledge clearly positive, so I added a bit of optimism to the conclusion. Additionally, I amended FN18 to credit Callum for new data and to explain lumpiness in the estimates.

3 June 2019 - FN13 specifying the EA Newsletter was updated with info from Aaron's comment. I also updated the conclusion slightly to explicitly mention discontinuing advertising campaigns as a deliberate reason for lower growth.

4 June 2019 - The highlighting for GiveWell's monthly unique visitors (excluding adwords) incorrectly identified 2016 as the year with the most visitors. That has been corrected to show 2015 as the year with the most visitors. (The underlying data was not wrong, just the highlighting.)

4 June 2019 - Corrected stats for the 80,000 Hours podcast using new data from Rob.

12 July 2019 - Fixed a typo in FN35 and FN34.


Endnotes

[1]: See Google Trends data. These numbers are not search volumes -- they’re the mean relative “score” for that year, relative to the search volume for the highest day between January 2004 and the end of December 2018. Each volume number is the number as of the last day of December of the reported year.
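
(A minimal sketch of the “mean relative score for that year” calculation described above, assuming a daily Google Trends export with columns named date and score; the actual export format and column names may differ.)

```python
# Sketch: yearly mean of daily Google Trends relative scores (0-100 scale).
# Assumes a CSV export with "date" and "score" columns; real exports may differ.
import pandas as pd

df = pd.read_csv("trends_effective_altruism.csv", parse_dates=["date"])
yearly_mean = df.groupby(df["date"].dt.year)["score"].mean().round(1)
print(yearly_mean)
```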

[2]: See WikipediaViews.org data. These are desktop pageviews for the “Effective altruism” Wikipedia article. See some more wiki data here and here.

[3]: From 80K stats collected here with modifications from Rob's comment. Note that the 80,000 Hours podcast didn't start until halfway through 2017.

[4]: Data from 2016 and earlier was collected by Vipul Naik. Data for 2017 and after is available but I am told that it would take too long to collect, so in the interest of publishing this post in a remotely timely manner, I will save collecting this data to next year.

[5]: Vipul’s dataset only starts in mid-September 2014, so it seems most accurate to not count this year.

[6]: These data come from the moderator panel for the Reddit. I was able to collect these data as I am a moderator. This panel is unfortunately not accessible to non-moderators. It also unfortunately only goes back one year at a time.

[7]: Due to the limitation of the Reddit moderator panel only going back one year at a time, I have to use old data covering September 2016 to August 2017 as the estimate for 2017.

[8]: These data are only slightly off -- they actually represent Jan 5, 2018 to Jan 4, 2019.

[9]: These data come from asking Jon Behar.

[10]: From GiveWell’s metrics spreadsheet.

[11]: These data come from asking Kelsey Piper, Vox Future Perfect staff writer.

[12]: Both r/EffectiveAltruism and r/smartgiving have been simultaneous EA subreddits since September 2012. r/smartgiving was the default EA subreddit until an intentional migration on 28 Feb 2016. I will use r/smartgiving numbers for the 2014-2015 period and r/effectivealtruism numbers for all periods after that, to reflect the transition. Note that this growth will therefore involve some inherent double-counting as people who were subscribed on r/smartgiving re-subscribe on r/effectivealtruism. Pageviews for reddit were calculated via http://redditmetrics.com/.

[13]: Data after 2016 comes from asking Aaron Gertler and is more reliable. Data from before 2016 comes from accessing archived data from Rethink Charity that is much more approximate. It should be noted that heavy growth in 2017 came from a heavy Facebook advertising campaign that was not continued into 2018.

[14]: As I’m a moderator of the EA Facebook group, I was able to collect these data from the moderator panel that comes with Facebook. This panel is unfortunately not accessible to non-moderators. Unfortunately, I only have data going back to July 2017, when there were 8,629 active users. At the end of 2018, there were 9,104 active users.

[15]: Data was only available going back to July 2017, so I fudge here by just doubling the number.

[16]: These data come from asking Julia Wise.

[17]: These data come from Steve Hindi. Note that these data are for school years (thus the “2014” period here represents July 2013 to June 2014, etc.).

[18] These data come from asking Marie Paglinghi and Callum Calvert. Note that pledge totals may be a bit jumpy as they can be sensitive to small changes in larger donors.

[19]: Represents the total 2014 donations, as recorded in the 2015 EA Survey.

[20]: Both 2015 and 2016 donations were recorded as of the 2017 EA Survey (as no EA Survey was run in 2016). This means that 2015 donations could be artificially low due to survivorship bias, if some donors didn’t fill out the EA survey two years later. (Not to mention the fact that there are likely many donors who don’t fill out the EA survey even one month later.)

[21]: These data will be recorded in the forthcoming 2019 EA Survey.

[22]: From GiveWell's 2015 metrics report

[23]: From GiveWell’s 2016 metrics report

[24]: From GiveWell’s 2017 metrics report

[25]: From GiveWell’s 2014 Top Charity Report

[26]: $44.4M for top charities (see GiveWell 2015 Top Charity Report) plus a special $25M donation to scaling GiveDirectly.

[27]: From GiveWell’s 2016 Top Charity Report

[28]: These data were collected via Vipul Naik. The data are preliminary and have not been completely vetted and normalized. Money from the Open Philanthropy Project is counted for the year in which the grant is announced, which may be different from the year the grant is decided or the year the grant money is actually disbursed. Note that this might make 2018 artificially low, as some 2018 grants may not yet be announced (or may have been announced but not recorded) as of the time of this writing.

[29]: Note that, according to Julia, 2017 was the last year when a staff member sent repeated emails to people reminding them to record their donations and Julia suspects the lower number in 2018 is a result of that. Julia notes that in Spring 2019, they will resume recontacting people, so they will see if this increases reported 2018 donations.

[30]: From https://animalcharityevaluators.org/about/impact/giving-metrics/

[31]: From https://app.effectivealtruism.org/funds

[32]: This value is not finalized yet.

[33]: Note that this separation is mainly for illustrative purposes. While it may be tempting to arrange this into some sort of EA funnel, it is not quite that, as we don’t have any evidence to back up this categorization. In fact, reading the EA Forum may actually signify fairly deep engagement despite being classified as reading, whereas being in the EA Survey panel is classified as committing but could just reflect a person who reads the EA Newsletter. Getting better metrics on EA engagement, as well as putting more effort into figuring out what the EA funnel may actually empirically consist of, is an ongoing project of the EA Survey.

[34]: However, it appears that the large growth in 2018 EA Survey takers was primarily driven by more people taking the EA Survey from the EA Newsletter, where the EA Survey was placed a lot more prominently (receiving a dedicated email) than in prior years and where the EA Newsletter itself had just undergone large growth the year before (between the 2017 and 2018 EA Surveys). This would suggest that the EA Survey growth might stagnate or decline in the future as sources of people finding out about the EA Survey also stagnate.

[35]: We’re considering adding data on pledges secured by Effective Giving, total pledge counts (not money) from Founders Pledge, total OpenPhil grants (amount and number of grants, and to more than just GiveWell), growth at the EA Hub, growth in traffic at effectivealtruism.org, sales of various EA books (e.g., Doing Good Better), metrics for the EA Forum (such as page views, total accounts, active users, total posts, total comments, and total upvotes), and metrics around local groups.


Credits

This essay is a project of Rethink Priorities. It was written by Peter Hurford with graphs by Neil Dullaghan. Thanks to David Moss, Neil Dullaghan, Michael Trzesimiech, and Marcus A. Davis for comments. Thanks to Michael Trzesimiech for compiling the table into a downloadable CSV. Also, additional thanks to everyone who helped provide the underlying data collected for this post. If you like our work, please consider subscribing to our newsletter. You can see all our work to date here.

Comments

It's worth keeping in mind that some of these rows are 10 or 100 times more important than others.

The most important to me personally is Open Phil's grantmaking. I hadn't realised that the value of their non-GiveWell grants had declined from 2017 to 2018.

Fortunately if they keep up the pace they've set in 2019 so far, they'll hit a new record of $250m this year. In my personal opinion that alone will probably account for a large fraction of the flow of impact effective altruism is having right now.

Regarding the Effective Altruism FB group member growth over time, I was able to piece together the following graph using archived snapshots and various other sources: https://i.imgur.com/Lejj0e1.png

The raw data (including sources for each data point) is available here. If anyone has more comprehensive data, please let me know.

Based on that, I estimate (linearly interpolate) the following member counts for January 1 of each year:

  • 2014: ~1905
  • 2015: ~4535
  • 2016: ~8610
  • 2017: ~11983
  • 2018: ~14119
  • 2019: ~16003

You have the actual data for 2018 and 2019. If you could share the correct counts for January 1 of those years it would be nice.
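
(For anyone wanting to reproduce the interpolation step described above, here is a minimal sketch; the snapshot dates and member counts are illustrative placeholders, not the archived values from the linked spreadsheet.)

```python
# Sketch: linearly interpolate a January 1 member count from two dated snapshots.
# The dates and counts below are illustrative, not the actual archived data.
from datetime import date

def interpolate(d0, n0, d1, n1, target):
    """Estimate the count at `target` assuming linear growth between two snapshots."""
    frac = (target - d0).days / (d1 - d0).days
    return round(n0 + frac * (n1 - n0))

# e.g. snapshots on either side of 1 Jan 2019
print(interpolate(date(2018, 11, 15), 15700, date(2019, 2, 1), 16200, date(2019, 1, 1)))
```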

This is the data moderators of the main FB group have collected about number of members over time: https://docs.google.com/spreadsheets/d/1pOiVv6q2dW6IcEHvGxp4TLKjUtOVkxX6GjqZ_91Vv7E/edit?usp=sharing

Wow, can you speak a bit more to how you recovered that data? That's impressive.

From my actual stats, it looks like there were 14,398 users at the beginning of 2018 and 15,294 users at the beginning of 2019, which matches your estimates pretty well.

I think that the 2018-12-05 datapoint is wrong because it came from a Quora answer which was later edited. I can't prove this, but it seems likely because the rest of the datapoints are monotonically increasing.

Most of the other data came from FB itself (as documented in the 'raw data' link above), so it should be pretty solid.

Hi Peter, thanks for such a detailed post. I think there could be a misunderstanding in the Founders Pledge numbers, as their pledge numbers are growing very quickly and are significantly higher than in 2015. (They have also been making great efforts to increase the impact of deployment, including into the far future.)

Also, in the commitment section there is Effective Giving, which is relatively new and making significant progress.

These could point to a somewhat different picture in the commitment section to the one described above.

Perhaps we can review this and provide an update if necessary?

Hi Luke. I reached out to Marie at Founders Pledge and found that the numbers I originally used were meant to be in the millions, not thousands. I have corrected the post and sent it to Marie to review again to triple check.

I agree that the growth at Founders Pledge, OFTW, and Effective Giving sounds impressive and I'll make a note to follow up next year to see if that changes the narrative.

Just wanted to add that this has now been fixed.

Thanks for this, really interesting.

It might be useful to include page views of ea.org in future, given that that's arguably the page that has been most designed to be a good landing page for EA.

Assuming you mean https://www.effectivealtruism.org

http://www.ea.org is something unrelated to EA.

Thanks - I can follow up and ask about that for next year.

Regarding EA Newsletter statistics: I didn't see this mentioned in the piece, but the heavy growth in late 2016 and early 2017 mostly happened because the team who were working on the Newsletter at the time advertised it heavily on Facebook. After the 2017 campaign ended, there wasn't any further advertising (as far as I'm aware).

The number of subscribers roughly tripled during this time, which translated to a 1.5-2x increase in the number of people opening the emails and clicking links (since new subscribers from FB ads weren't as interested in the content). I don't know whether the campaign stopped because the ads stopped working as well, or for some other reason.

I'm considering trying another advertising campaign at some point, but over the last few months, my focus (with the time I actually have to spend on newsletter work) has been on improving the quality of our content and improving our measurement (both for objective open-rate data and subjective "what's our impact?" data).

Anyway, the Newsletter's patterns of growth were heavily "hacked" (albeit not in a bad way) and shouldn't be taken as a measure of organic interest in effective altruism.

Thanks. I edited the post a bit more to mention that.

When I asked about what has caused EA movement growth to slow down, people answered that it seemed likeliest that EA made the choice to focus on fidelity instead of rapid growth. That is a vague response. What I took it to mean is:

  • EA as a community, led by EA-aligned organizations, chose to switch from prioritizing rapid growth to fidelity.
  • That this was a choice means that, presumably, EA could have continued to rapidly grow.
  • No limiting factor has been identified that is outside the control of the EA community regarding its growth.
  • EA could make the choice to grow once again.

Given this, nobody has identified a reason why EA can't just grow again if we so choose as a community by redirecting resources at our disposal. Since the outcome is under our control and subject to change, I think it's unwarranted to characterize the future as 'bleak'.

Great work Peter, thanks so much for doing this! Super helpful to be able to see all these numbers aggregated in the same place. And I love the categorization of the metrics. Strong upvote.

A couple of thoughts on metrics to include next year:

· I agree with Michelle’s comment that traffic for EA.org is an important metric to look at, especially since that’s the top result when people google EA. I’d be interested in both organic search web traffic, and overall traffic (ex paid traffic).

· In general, I think it’s most helpful to look at numbers excluding paid traffic to give a better sense of organic growth rates. As Aaron notes, this helps explain the EA newsletter trajectory, and it’d be interesting to see how excluding paid traffic might affect the 80k traffic numbers as well.

· Total operational spending by EA orgs could be a helpful perspective on how inputs to EA are changing over time; the current metrics are all focused on outputs, and it would be nice to relate the two.

Quick answer for 80k: Paid traffic only comes from our free Google Adwords, which is a fixed budget each month. Over the last year, about 12% of the traffic was paid, roughly 10,000-20,000 users per month. This isn't driving growth because the budget isn't growing.

Thanks for clarifying Ben!

In general, I think it’s most helpful to look at numbers excluding paid traffic to give a better sense of organic growth rates. As Aaron notes, this helps explain the EA newsletter trajectory, and it’d be interesting to see how excluding paid traffic might affect the 80k traffic numbers as well.

I think this is a good idea but I'm unsure if this data is available. I can try to make a better effort to find more data sources that do not include paid traffic for next year.

Total operational spending by EA orgs could be a helpful perspective on how inputs to EA are changing over time; the current metrics are all focused on outputs, and it would be nice to relate the two.

This is a metric I actually put some effort into collecting for this year - I tried to make a basket of orgs that have been around since ~2014 and have publicly accessible budgets, but it proved very time consuming to collect, and I felt it was potentially misleading due to orgs being excluded or orgs not publishing clear budgets.

My hope is that including "EA Funds payouts" could help capture some of this. One thing I really ought to have included but for some reason didn't think to is also total OpenPhil grants (not just to GiveWell or excluding GiveWell) as this may also capture some of the growth in the broader EA space.

Re: non-paid traffic, it should be very easy (a few minutes) to pull traffic data ex-adwords for any sites that are set up on Google Analytics. Excluding other types of paid traffic/conversions (e.g. newsletter signups driven by FB ads) would be harder (though generally doable).

One thing I really ought to have included but for some reason didn't think to is also total OpenPhil grants (not just to GiveWell or excluding GiveWell) as this may also capture some of the growth in the broader EA space.

Agree OpenPhil grants would be a helpful perspective on this, both total grants and grants within their EA focus area (which would be a proxy for meta investment)

Thanks for producing this Peter, it's very helpful. I sent you some metric data on the 80,000 Hours Podcast, but now that I've seen the post, I can give you the best numbers for the table. I would suggest putting these figures in instead.

Net new podcast subscribers added

2017 - 4,600

2018 - 10,500

Total podcast downloads/plays

2017 - 87,600 (average episode length 1.61 hours.)

2018 - 517,100 (average episode length 1.97 hours.)

Notes on interpretation

  1. The podcast only started halfway through 2017; I'm not sure how you want to handle that.

  2. Those are the maximum number of subscribers recorded at any point in the year. It's probably a few % too high in both cases, but I've found that's the measure most robust to random measurement variations. The overestimation should also be constant year to year.

  3. Podcast downloads/plays don't correspond to actual times people listened to a full episode. They include people pressing play but only listening to a few seconds; bots downloading the show; automatic downloads by the podcasting software that are never actually listened to; and so on. So they're massive overestimates of the number of times an episode was listened to, say, half way to completion. However, the overestimate is likely to be a pretty constant fraction year-to-year, so you can still make relative comparisons.

    — RW

How big do you expect that fraction to be? (Or: what percentage of those numbers do you expect to be 'real listeners'?)

Thanks - I updated the post with that data.

[EA Forum traffic] data for 2017 and after is available but I am told that it would take too long to collect, so in the interest of publishing this post in a remotely timely manner, I will save collecting this data to next year.

I’m very surprised that pulling this data is non-trivial; I would have guessed it would take <5 minutes to get from Google Analytics. Is the EA Forum still set up on Google Analytics (which is where the pre-2017 data came from)? If not, why not, and how do those managing the platform measure usage and engagement?

I don't know - I wasn't given any details on why it was a difficult to fulfill request. (Edited to add: I also didn't ask for any details.)

The prior version of the Forum (before ownership was transferred to CEA) wasn't set up for Analytics. We began tracking this data one month ago. The Forum currently gets ~3000 pageviews per day from ~1000 users.

Other data I could have provided to Peter if he had asked: New posts and comments are appearing at a much faster rate than before the new Forum launched.

For example, there were 26 posts in the month of September and 71 posts in February (not counting posts written by CEA staff). The average number of comments per post doesn't seem to have changed much, but the most commented-upon posts get many more comments than they used to (at least two posts in the last two months got more than 100, more than double the count of any post in September). The "71" number is pretty typical -- we've been getting between two and three new posts per day in almost every week since the new Forum launched in November.

I haven't yet taught myself enough SQL to get these numbers from our database, but I found them within 10 minutes by clicking "load more posts" a bunch of times on the "All Posts" page. If anyone else does the same, they'll also see the dramatic uptick in new posts after the new Forum launched. (Though this doesn't mean that EA is more popular now -- the new interface and extra promotion are probably most of the reason we've seen more posts, rather than an increase in the number of people who care about the movement.)

--

Peter didn't request this kind of data; he only asked for "page views and [the] number of newly created accounts".

As far as I know, we don't have Analytics data for page views. Getting account creation data back to 2014 should be doable, but we didn't have time to get to that request before Peter published the post.

We are interested in this data for our own purposes and may collect it at some point; I offered to let Peter know if we did so he could update the post, and he replied positively to the offer.

I generated a graph of the number of EA Forum posts per year, as well as the number of new user registrations. I extracted the data using the GraphQL API.

The raw JSON data for all posts is here. I had to split the user data into two files due to upload limits. The raw JSON data for all unbanned (but otherwise unfiltered) users is here. The JSON data for all banned users is here.
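
(As an illustration of the kind of aggregation involved, here is a minimal sketch that counts posts per year from a JSON dump like the ones linked above. It assumes each post object carries an ISO-8601 "postedAt" timestamp, which is an assumption on my part -- the actual field names in the dump may differ.)

```python
# Sketch: count posts per year from a downloaded JSON dump of post objects.
# Assumes each post has an ISO-8601 "postedAt" field; the real field name may differ.
import json
from collections import Counter

with open("ea_forum_posts.json") as f:
    posts = json.load(f)

by_year = Counter(p["postedAt"][:4] for p in posts if p.get("postedAt"))
for year, count in sorted(by_year.items()):
    print(year, count)
```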

Results:

  • Posts by month; Posts by year
    • (The post count includes "meta" posts.)
    • 2011: 1
    • 2012: 7
    • 2013: 66
    • 2014: 234
    • 2015: 460
    • 2016: 296
    • 2017: 285
    • 2018: 442
    • 2019 (so far): 433
  • New user registrations by month; New user registrations by year. If you include banned users, you get this monstrosity.
    • I divided the users into four categories in order to try to make the numbers more useful. The year entries below list the frequency values in this order.
      • unbanned users with non-zero posts/comments and non-negative karma.
      • unbanned users with non-zero posts/comments and negative karma
      • unbanned users with zero posts/comments
      • banned users
    • That said, the database's indices for number of posts and number of comments don't seem to reflect reality perfectly. Take a quick look at this table for what I mean.
    • Note: A previous version of this comment had different values, because I mistakenly was ignoring users with null-valued karma in the database. Now I just treat them as if they had zero karma.
    • 2014: 336; 7; 186; 1
    • 2015: 284; 9; 538; 1
    • 2016: 188; 36; 540; 3
    • 2017: 224; 70; 617; 7
    • 2018: 366; 43; 746; 21
    • 2019 (so far): 382; 22; 652; 1750

Thanks, this is really cool to see. I will follow up next year to add total posts as a metric. I added this idea to FN35.

Thanks. By the way, I updated the comment you replied to.

Thanks Aaron, very helpful data and context.

Vipul wrote up observations on Forum traffic from September 2014 to December 2016, which seems to be based on data from Google Analytics. Any way to splice this data together with the more recent history?

FYI, GiveWell's money moved in 2018 ex Good Ventures was ~$65 million, which makes the donation numbers look somewhat better.

Thanks - I'll be sure to include that number for next year.

I think an aspect of this may be the dearth of continuously updated social media. I have been reading about EA for some time, but really went down the Internet rabbit hole on it today and looked for ways to remain engaged. I tried to find some Instagram pages to follow, for instance, but did not easily find any (or any links, should they exist) for 80k, EA, or GWWC that either existed or had been continuously updated since 2016. I think this is a subject that should still be generating more interest and has huge potential to, but it seems to be a bit disconnected from the mainstream in ways that could be remedied by a more effective social media strategy.

I'm interested in this question. I've made the decision to quit most social media, and instagram isn't very popular in my friendgroup anyway. But I live in Silicon Valley, and I know people's habits change fast. I really buy the thesis that there's a large demographic of people who'd be interested in EA but would be much more likely to hear about it if they could discover / remain engaged with it on instagram.

Still, I'm not sure what it would look like. In my mind EA messages do poorly when they're forced to fit into images. The arguments for EA don't need to be wrapped in dense philosophy papers, but given the difficulty of conveying the ideas with fidelity, I don't have high confidence that an instagram page would do a good job of it.

Edit: Wait, I think I got this tone wrong. I do want to state some level of skepticism coming in, but I'm genuinely really interested in responses to my skepticism and in ways of doing a good EA-aligned instagram account.

Two interesting questions that bounce of this are "how many members does EA have?" (obviously this is somewhat vague) and "how many members would be optimal?" (more members has clear benefits, but it's presumably possible to get too big). From e.g. Facebook group membership and survey responses, it seems like the answer to the first question is somewhere in the 1000-10000 range. I'm not sure what the best points of comparison with regard to the second question are, but the Extinction Rebellion movement, most major British political parties, and Scientology all have significantly more members.

To get a better sense of “how many members does EA have?”, going forward I suggest asking organizations for data on unique website visitors rather than pageviews since the latter somewhat conflates number of people and degree of engagement (pages per session).


Great work! I wonder if there are any ways to track quality-adjusted engagement, since that's what we've mostly been optimizing for over the last few years. E.g., if low-quality page views/joins/listeners are going down, it seems hard to compensate with an equal number of high-quality ones because they're harder to create. 80k's impact-adjusted plan changes metric is the only suitable metric I can think of.

Impact adjusting is fairly values sensitive and may differ dramatically even between EAs, which is why I'd prefer to report raw data and let other people attempt their own impact adjustments on top of it.

I don't think you get enough information from pageview tracking to be able to impact adjust each pageview, but perhaps you could track engagement hours (as 80K does) or engagement on particular target pages. Additionally you could impact adjust based on source (e.g., weighing growth in 80K pageviews higher than growth in Future Perfect pageviews).

The pledge counts and donation totals also likely lend themselves to impact adjusting fairly well, as you could impact adjust based on charities donated to (most of the raw data to do this is available) or the kind of pledge taken.
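
(To make the source-weighting idea concrete, here is a minimal sketch with entirely made-up weights and growth figures; the weights are exactly the kind of values-sensitive judgment call mentioned above.)

```python
# Sketch: a source-weighted ("impact adjusted") view of pageview growth.
# Weights and growth figures are made up purely for illustration.
weights = {"80K": 1.0, "Future Perfect": 0.3}                  # subjective, values-sensitive
pageview_growth = {"80K": 200_000, "Future Perfect": 500_000}

adjusted_growth = sum(weights[s] * pageview_growth[s] for s in weights)
print(adjusted_growth)  # 350000.0, versus a raw total of 700,000
```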

Thanks for this post.

One question: Was the last sentence of footnote 35 meant to be the last sentence of footnote 34? The sentence is "This would suggest that the EA Survey growth might stagnate or decline in the future as sources of people finding out about the EA Survey also stagnate", and the survey is the focus of footnote 34, but not of footnote 35.

Yep! Fixed! Thanks!