David_Moss

Karma: 4506 · Joined Aug 2014

Bio

I am the Principal Research Manager at Rethink Priorities, working on, among other things, the EA Survey, the Local Groups Survey, and a number of studies in moral psychology, focusing on animal welfare, population ethics, and moral weights.

In my academic work, I'm a Research Fellow on a project on 'epistemic insight' (mixing philosophy, empirical study, and policy work) and on moral psychology studies, mostly concerned with either effective altruism or metaethics.

I've previously worked for Charity Science in a number of roles and was formerly a trustee of EA London.

Sequences
1

EA Survey 2020

Comments
394

I can confirm that we'll be releasing the results publicly and that there wasn't a 2021 EA Survey. And you can view all the EAS 2020 posts in this sequence.

Many thanks for checking in and sharing the survey! I can confirm that we're leaving the survey open until the end of the year now, like last year. Please also view this post about the extra questions about FTX that have been added now, and the separate survey people can take with questions about FTX, if they have already completed the survey.

Thanks for taking the time to comment!

All of the questions you had substantive comments about the wording of were external requests which we included verbatim.

Re. the order of the dates: in testing, some people found it more intuitive to read left to right and others top to bottom (for context, it's not specifically designed as a 'grid'; it just happens that the columns are similar lengths to the rows). It could be put in a single column, though not without requiring people to scroll to see the full questions, or changing the style so that it doesn't match other questions. Exactly how the questions appear will also vary depending on whether you're viewing on a PC, phone, or tablet.

[Personal connections] are cited as an important reason that people get involved in EA in the Open Phil EA/Longtermist Survey

  1. 35% of people said personal contact with EAs was important for them getting involved
  2. 38% said personal contacts had the largest influence on their personal ability to have a positive impact


I just wanted to check whether you accidentally cited the OP Survey rather than the EA Survey? These results and the question wording are identical to the 2020 EA Survey: 35.4% said "personal contact with EAs" was "important for [them] getting involved in EA", and 38.7% said "personal contact with EAs… had the largest influence on [their] personal ability to have a positive impact".

It’s true that the OP survey and EAS asked many of the same questions, and it’s true that the OP survey and EAS tend to get exceptionally similar results (when you filter for highly engaged EAs), but that seems like quite the coincidence, and I don't think OP actually asked both these questions.

Fwiw, I think the OP Survey and EAS are complementary, and it's generally good to cite both. Much more could be written about the circumstances in which it makes sense to use different results from each of these surveys, since I think it is not straightforward. I'd like the survey team to do this sometime, but we lack capacity at present.


Thanks for the suggestion! We can certainly add something about this to the landing page. [And have now done so]

I would also note that text like this is usually already included where the survey is distributed, i.e. when the survey goes out through the EA Newsletter or CEA social media, it's accompanied by a message like "If you think of yourself, however loosely, as an “effective altruist,” please consider taking the survey — even if you’re very new to EA! Every response helps us get a clearer picture" before people see the survey. That kind of message didn't seem as necessary in the EA Forum announcement, since this is already a relatively highly engaged audience.

Thanks! This is useful feedback.

We'd like to include more questions in the extra credit section, and I agree it would be useful to ask more about the topics you suggest. 

Unfortunately, we don't find that adding more questions to the extra credit section is completely 'free'. Even though it's explicitly optional, we still find people sometimes complain about the survey's length, including the optional extra credit section. And there's still a tradeoff in terms of how many people complete all or part of the extra credit section. We'll continue to keep track of how many people complete the survey (and different sections of it) over time to try to optimise the number of extra questions we can include. For example, last year about 50% of respondents started the extra credit section and about 25% finished it.

Notably, we do have an opt-in extra survey, sent out some time after the main EA Survey. Previously we've used this to include questions requested by EA (usually academic) researchers whose questions we couldn't prioritise including in the main survey (even in extra credit). Because this is completely opt-in and separate from the EA Survey, we're more liberal about including questions, though there are still length constraints. Last year about 60% of respondents (roughly 900 people) opted in to receive this, though a smaller number actually completed the survey when it was sent out.

We've previously included questions on some of the topics which you mention, though of course not all of them are exact matches:

  • Moral views: We previously asked about normative moral philosophy, metaethics, and population ethics
  • Identification with EA label: up until 2018, we had distinct questions asking whether people could be described as "an effective altruist" and whether they "subscribe to the basic ideas behind effective altruism". Now we just have the self-report engagement scale. I agree that more about self-identification with the EA label could be interesting.
  • Best and worst interactions with EA: we've definitely asked about negative interactions or experiences in a number of different questions over the years. We've not asked about best interactions, but we have asked people to name which individuals (if any) have been most helpful to them on their EA journey.
  • Community building preferences: we've asked a few different open-ended questions about ways in which people would like to see the community improve or suggestions for how it could be improved. I agree there's more that would be interesting to do about this.

Thanks for the comments!

Is the paragraph below saying that surveying the general population would not provide useful information, or is it saying something like 'this would help, but would not totally address the issue'?

It's just describing limitations. In principle, you could definitely update based on representative samples of the general population, but there would still be challenges.

Notably, we have already run a large representative survey (within the US), looking at how many people have heard of EA (for unrelated reasons). It illustrates one of the simple practical limitations of using this approach to estimate the composition of the EA community, rather than just to estimate how many people in the public have heard of EA. 

Even with a sample of n=6000, we still only found around 150 people who plausibly even knew what effective altruism was (and we think this might still have been an overestimate). Of those, I'd say no more than 1-3 seemed like they might have any real engagement with EA at all. (Incidentally, this is roughly the ratio that seems plausible to me for how many people who hear of EA then actually engage with EA at all, i.e. 50-150:1 or less.) Note that we weren't trying to see whether people were members of the EA community in this survey, so the above estimate is just based on those who happened to mention enough specifics (like knowing about 80,000 Hours) that it seemed they might have been at all engaged with EA. So, given that, we'd need truly enormous survey samples to reach a decent number of 'EAs' via this method, and the results would still be limited by the difficulties mentioned above.

Thanks for asking! We would definitely encourage community builders to share it with their groups. Indeed, in previous years, CEA has contacted group organizers directly about this. We would also encourage EAs to share it with other EAs (e.g. on their Facebook page) using the sharing link. In general, I would not be concerned about you 'skewing the results' by sharing and encouraging people to take the survey in this way, so long as you don't go to unusual lengths to encourage people (e.g. multiple reminders, offering additional incentives to complete it, etc.).

Thanks for asking! This is a question requested by another org (in 2019), so I can't give you the definitive authorial intent. But we would interpret this as including virtual personal contact too (not just in-person contact).
