
This year I was in charge of marketing for EA Global. This gave me some useful insights into the EA community and why it's growing rapidly. I'm planning to do a full write-up in the future, but two insights are worth sharing now.

More than half of EA Global applications were due to referrals

Of the 2,152 applications that we received for EA Global this year, around 55% came because of someone else in EA. This includes 689 applications through the nomination system and 486 from people who said they applied because of "someone involved in EA." As someone who spent the last 3+ months of my life trying to get applications for EA Global, it's very interesting that the EA community itself is still the best marketing tool we have.
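As a quick sanity check on the arithmetic (all figures are the ones reported above; this is just an illustrative sketch, not anything used to produce the numbers):

```python
# Referral share of EA Global applications, using the figures above.
nominations = 689          # applications via the nomination system
personal_referrals = 486   # applications citing "someone involved in EA"
total_applications = 2152

share = (nominations + personal_referrals) / total_applications
print(f"{share:.1%}")  # 54.6% -- i.e. "around 55%"
```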

EAs are willing to tell others about EA

As part of registration, we asked the following question:

How likely is it that you would recommend Effective Altruism to a friend or colleague?

This is the so-called Net Promoter Score (NPS) question, popularized by Bain & Company in the early 2000s. NPS is a popular method of measuring customer loyalty. Answers are given on a 0 to 10 scale, with 0 being "not at all likely" and 10 being "extremely likely." A brand's score is the percentage of 9s and 10s (promoters) minus the percentage of 6s and below (detractors), with 7s and 8s (passives) ignored. A score of +50 is considered excellent. Apple scores in the high 60s to low 70s.
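The scoring rule above can be sketched as a small function (a hypothetical helper for illustration, not anything used for the actual survey):

```python
def net_promoter_score(ratings):
    """NPS: percentage of promoters (9-10) minus percentage of detractors (0-6).

    Passives (7-8) count toward the total but toward neither group.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# A brand where 60% answer 9-10, 25% answer 7-8, and 15% answer 0-6
# scores 60 - 15 = +45.
sample = [10] * 60 + [8] * 25 + [5] * 15
print(net_promoter_score(sample))  # 45.0
```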

Out of 438 respondents, our average score was 8.53 with an NPS of +45. 42% of people gave EA a 10 out of 10. This is in line with a similar question asked of EAGx attendees which showed an NPS of +45 across 52 respondents.

NPS has a number of flaws (read the Wikipedia page for details) so I wouldn't take it too seriously. Yet, the NPS score coupled with the referral evidence suggests that as a brand EA has a very dedicated fanbase that is willing to promote the brand.

Potential implications

We might be able to draw a few implications from this:

Being More Welcoming

One popular topic of conversation is how to make EA more welcoming. And, of course, we should strive to be more welcoming. Yet, if EA is already building a highly loyal community willing to promote EA, then perhaps this isn't one of the most pressing problems facing EA.

EA is too cultish

One reaction that people sometimes have to EA is that it seems "cultish." Seeming cultish to outsiders is clearly a bug and something we should fix. Yet the perception probably results from a combination of high brand loyalty and a set of ideas that sometimes produce radical life change. Put this way, the underlying dynamic seems more like a feature than a bug. Indeed, some of the best brands have "cultish" followings (Apple comes to mind).

 

 

Edit: Howie's post below has updated me to think that we can't infer much about the welcomingness of EA, or whether EA is too cultish, on the basis of the data here. At best we might be able to conclude that EAs think their friends and colleagues will like the community, but this doesn't tell us much about what they'll think of the community once they've interacted with it.

 

Comments

I think the tone of this post projected a confidence around your empirical views that initially led me to glaze over the actual data you present. But on a second read I noticed that the NPS at EAG was 5 points lower than the threshold you give for "excellent" (and far short of what you say is typical of Apple). This felt a bit jarring in light of the takeaways that "EAs really love EA" - so much so that being more welcoming isn't a very pressing problem.

Nothing in this post is actually inconsistent with an NPS that's short of excellent. I don't even really have an opinion about whether NPS is a useful measure. But it does make me feel like the "potential implications" you list are things you already believed. Did the data affect your views much one way or another? Do you have a sense for the threshold at which you would have instead written a post that said "even though the NPS at EAG was only good, not great, I still believe that making EA more welcoming is not one of the most pressing problems facing EA?"

Similarly, I don't really see how either of your conclusions could really be a "potential implication" of the fact that more than half of EAG applications were due to referrals. To me this data seems equally consistent with the exact opposite conclusions - only people who already know an EA well end up applying to EAG, which is evidence that EAG is unwelcoming and too cultish. Alternatively, if very few applications had come through referrals, say 5%, you could just as easily argue that this was evidence against the need to be more welcoming (tons of people who don't know us feel welcome to apply) and against EA being too cultish (it's not a closed system at all)!

Obviously it's ok for you to have views that aren't driven by this fairly limited data. But if the data isn't really one of the stronger factors informing your views, I think I'd probably prefer to see them presented in separate posts. I think there's otherwise a risk of building a habit of using "data as soldiers" (https://wiki.lesswrong.com/wiki/Arguments_as_soldiers) and losing opportunities to update.

[anonymous]

I think the tone of this post projected a confidence around your empirical views that initially led me to glaze over the actual data you present.

This wasn't the tone I was going for. On my reading of the post, it's pretty hedge-y, with the exception of the title. Can you help me out by pointing out some ways that I seem overconfident in the empirical view?

To me this data seems equally consistent with the exact opposite conclusions - only people who already know an EA well end up applying to EAG, which is evidence that EAG is unwelcoming and too cultish.

I don't see why this is a reasonable conclusion to reach. I think the welcoming/unwelcoming distinction is a claim about the experience of being in the community and interacting with EAs. Since new people haven't had a chance to interact with EAs, it would be surprising if they found the community unwelcoming.

It could be that the EA brand prevents people from wanting to join the community in the first place. That seems like a hypothesis worth testing, but it doesn't seem to me like a claim about how welcoming the community is.

That said, I am inferring from "EAs think their friends will like EA" to "therefore, the community is welcoming." I agree that there are a number of ways that this inference could break down and I probably should have stuck more to the first claim instead of the second.

I added an edit to the post to reflect this concern.

Obviously it's ok for you to have views that aren't driven by this fairly limited data. But if the data isn't really one of the stronger factors informing your views, I think I'd probably prefer to see them presented in separate posts.

This is fair. The data here does affect my view, but the view is also affected by some other data we collected around EA Global. I'm not sure a separate post would be the way to go, but a more nuanced discussion of whether this implies anything about the welcominness of the community would have been better.

I think welcoming/unwelcoming is one of those things that most people initially assess almost immediately upon contact with a community. Yes, people who stay in the community will update their perceptions over time, but I have definitely been to enough meetups, dances, and general social gatherings to have a sense after one interaction with a community of whether I feel welcome and to have noticed that this affects my probability of returning. It even affects my probability of returning if I go with a friend, or know a subset of people there; being welcomed by one or a few people is not necessarily enough if the community as a whole doesn't feel welcoming, even if those people are deeply connected in the community and think I will want to return.

I also think that what it means for long-time community members to feel like a group is welcoming is pretty unclear, because they themselves actually do not need to be made welcome there. I would be much more interested in thinking about whether EA is welcoming based on how welcoming newcomers (say, people who have participated in local or online EA groups at least once, but for no longer than a year) think it is, what proportion of newcomers return or bring friends, etc.

Thanks for being so open to feedback and non-defensive on this and thanks especially for updating the body of the post. I think there's a big problem where people change their minds due to feedback but it never gets propagated out to readers b/c it's buried in a comment section. With preliminaries out of the way. . .

This wasn't the tone I was going for. On my reading of the post, it's pretty hedge-y, with the exception of the title. Can you help me out by pointing out some ways that I seem overconfident in the empirical view?

Looking back at it, I do think the headline really primed me here. But I think the other things were: "the NPS score coupled with the referral evidence suggests that as a brand EA has a very dedicated fanbase that is willing to promote the brand." I see how this could feel like a nitpick because you said "suggests" and not "proves," but I think somebody reading the post quickly, glossing over the actual numbers, and trusting your description of their implications (which I think is pretty common/reasonable) would take this to mean that the NPS score unambiguously is evidence in this direction, and that the uncertainty is due to the fact that it's just one form of evidence.

I'm not sure exactly what I would've said instead, but I would've said something more moderate when describing data that's less than excellent on its own terms. Taking out the phrase "very dedicated" would've made a pretty big difference, I think. It also would've made a big difference if there were some explicit discussion of the fact that the NPS score was good-but-not-great and could reasonably have been better. Putting the data in there for comparison definitely helps, but if I'm reading something about metrics I've never seen before, I kind of expect the writer to do the work for me and tell me how to interpret the comparison. If there's some data and then the author says it suggests a "very dedicated fanbase," I'm likely to assume that the score EA got is relatively close to the score Apple got. If it's not, that seems like an important enough fact to grapple with instead of just present.

The other places you implicitly described what the data mean are "highly loyal community willing to promote EA" and "high brand loyalty." Combining all of this, I think the post really reads like EA killed it on the NPS front.

I don't see why this is a reasonable conclusion to reach. I think the welcoming/unwelcoming distinction is a claim about the experience of being in the community and interacting with EAs. Since new people haven't had a chance to interact with EAs, it would be surprising if they found the community unwelcoming. It could be that the EA brand prevents people from wanting to join the community in the first place. That seems like a hypothesis worth testing, but it doesn't seem to me like a claim about how welcoming the community is.

Fair points, but I'm not convinced.

I’d guess that there’s a lot of people who have had enough contact with the EA community to have been affected by its welcomingness or lack thereof, but who wouldn’t have counted as applying to EAG b/c of someone else in EA according to your metric. People with relatively weak connections to EA are likely to be most affected by welcomingness, so it seems possible that the relevant margin here is whether people who fall into the non-referral group feel welcome to apply.

I think a major mechanism through which welcomingness has effects is welcomingness -> experience of people interacting with the community -> EA’s reputation/brand among people outside the community. So I’d actually expect welcomingness to have a big effect on whether EA has a brand that gets people to want to join the community in the first place. For a fairly big, somewhat outward facing event like EAG, where the pool of potential non-EA applicants is so large compared to the pool of potential EA applicants, it seems possible that this mechanism through which welcomingness decreases the proportion of applicants coming through referrals could dominate your proposed mechanism through which welcomingness increases the proportion of applicants coming through referrals.

JTBC, I don’t have a net take on the above. My main point is just that the direction is ambiguous, so I don’t think the data says much about welcomingness.

In fact, the team most likely to be growing EA, the Effective Altruism Outreach team was cautioning against growth. It seems reasonably clear that EA is growing virally and organically -- exactly what you want in the early days of a project.

Why do you want a project to grow virally and organically in its early days? That seems like the opposite of what I'd guess; when a project is young you want to steer it thoughtfully and deliberately and encourage it to grow slowly, so that it doesn't get off track or hijacked, and so you have time to onboard and build capacity in the new members. Has the EAO team come to think that fast growth is good?

[anonymous]

Fair point. I've deleted that section as a result.

The idea in my head is that it's better to be getting growth because people really value and want to share your product than it is to only be able to get growth through direct marketing.

My view on growth is that as our tools to onboard new people improve we'll want to grow faster. The tools aren't excellent now, but I'm optimistic that we will develop some better material soon.

"... of course, we should strive to be more welcoming. Yet, if EA is already building a highly loyal community willing to promote EA, then perhaps this isn't one of the most pressing problems facing EA."

It seems to me that EA is great at getting people who are very much like existing EAs, which leads to the risk of ossification of "types of people in EA" (in other words, a lack of diversity). I think being more welcoming is important to avoid this. (If I remember correctly, you agreed with this point last year; have you changed your mind on this? If so, how?)

Agreed, and though this is good evidence that people in EA are having a positive experience, it has almost no chance of detecting the people who aren't, since participation is conditional on 1) the subjects choosing to invest significant time and money in attending EAG, and 2) the subjects' applications being approved by the conference organisers.

I'm not meaning to suggest that the application process was actively weeding out negative people, but pointing out there are a number of significant selective processes before people were asked this question. For that reason it's got limited power to detect anybody who doesn't have a positive experience of EA, and shouldn't be used as evidence of no problem.

It is good to hear about positive experiences though, so thanks for sharing it.

[anonymous]

Agree 100%.

[anonymous]

My guess is that if EA is growing via people telling their friends, then being generally welcoming probably isn't the problem. Maybe we're very welcoming to some people and not at all welcoming to others, but presumably, similar behaviors work with a wide range of people.

An alternative explanation is that the demographic makeup of EA is largely caused by founder effects. EA grows through referrals, and people tend to know other people who are similar to themselves. So, absent some external pressure, we should expect EA to grow to be similar in demographic makeup to the way it was at its founding. If this is true, then EAs being more welcoming probably won't solve the problem (even if it's a good thing to do for other reasons).

At EA Berkeley, we have members who came through people telling their friends, and people who we got essentially by picking students at Berkeley near-randomly (but selecting for altruistic tendencies). (TL;DR: These students get involved either through a speed Giving Game or by seeing an ad for our course on EA, rather than through social networks.)

The members who came from people telling their friends fit the standard EA demographic, as you might expect -- mostly male, and all STEM (and mostly Comp Sci). Let's call them group A.

Meanwhile, the students who we got near-randomly, selecting for altruistic tendencies, are more likely to be female and have essentially no specific major. Let's call them group B.

Now, group A forms a much larger percentage of EA Berkeley, so they tend to determine topics of conversation and group activities. We've talked about NP-completeness, video games, the popularity of certain CS professors, etc. These are great topics of conversation for group A, even if the individual members don't know each other. Note that they have nothing to do with EA. However, there are usually one or two members of group B at these events, and inevitably they are quiet and don't interact much with everyone else.

With that sort of dynamic, it's not surprising to me at all that we tend not to get very many active members from group B. When I say "being more welcoming", I mean not talking about NP-completeness, video games and popularity of CS professors, and instead talking about topics that everyone could be a part of. This seems like an important change that will directly lead to more retention of members in group B, which in turn seems important to me because it increases the diversity of EA.

Note that "EA is already building a highly loyal community willing to promote EA" is perfectly compatible with this view -- that loyal community is group A.

Also, as a result I strongly disagree with "presumably, similar behaviors work with a wide range of people" -- the behavior of "talking about NP-completeness, etc." works great with CS majors but not others.

I mostly agree with your last paragraph, except that I think "being more welcoming" is the external pressure that would help EA become more diverse (through the mechanism outlined above).

I would like to offer a contrasting view point from our experiences at EA McGill. Our members seem to be more often in subjects like Economics and International Development and since McGill has a high female to male ratio, they also tend to be women.

I actually happen to be one of the few CS students. From what I can tell this difference is primarily due to the founder effect, as our founders were in more economics-like subjects, and due to the different demographic makeups of Berkeley and McGill.

I completely missed this comment, sorry.

I think it's absolutely the founder effect. Sorry I didn't make that clear -- EA Berkeley's demographics are much more a product of the social circles of the most committed members (mostly CS, mostly male, disproportionately Indian), than they are a product of EA's demographics as a whole.

Huh! Does economics at McGill have more women than men?

[anonymous]

When I say "being more welcoming", I mean not talking about NP-completeness, video games and popularity of CS professors, and instead talking about topics that everyone could be a part of. This seems like an important change that will directly lead to more retention of members in group B, which in turn seems important to me because it increases the diversity of EA.

Excellent point. I think I agree.

I think this is an instance of "Selecting on the Correlates" which I talked about in my talk at EA Global this year (starts at minute 36). Given the examples you cite, I agree that this exerts a selection pressure against diversity and that this is bad.

Yet, we want to draw some important lines here. Interest in talking about CS professors is not a selection pressure we want to exert. But, interest in talking about EA-relevant topics (even unusual or controversial ones) is a selection pressure we want to exert. It's important to strike the right balance.

I think the issue is that "be welcoming" doesn't seem to be very helpful. To me, it sometimes seems to mean something like "be nice" which I don't think we're failing at. Other times it seems to mean something like "be normal" where that can refer to moderating actions or opinions to sync more closely with mainstream thought which may or may not be a good thing.

I think the "don't select on the correlates" idea makes the point in a more crisp way.

Haven't had a chance to listen to your talk, which might clear this up, but while "don't select on the correlates" does technically capture Rohin's point, it doesn't really resonate with me as making the point in a more crisp way, especially when contrasted with "being welcoming."

I think one of the more insidious features of the type of phenomenon Rohin's talking about is that, from the inside, it doesn't FEEL like you're making a selection at all. Indeed, apparently EA Berkeley's intentional/explicit attempts at selection were basically random - selecting for almost nothing other than altruism. But, despite the lack of explicit selection, there was still a selection effect.

Asking people to do selection differently feels pretty far removed from the actual actions (if any) we might want someone to take if a lot of those people don't by default feel like they're doing selection at all.

Okay, I agree with the "don't select on the correlates" phrasing.

That said, when I hear "be welcoming", and even "be nice", I don't hear "Don't talk about controversial EA-related topics", I hear something more like "Don't talk about CS professors", which I certainly do think we're failing at. (Heck, we couldn't do this at EA Berkeley, which already feels more diverse to me than the general EA community.) I don't know if everyone else means that when they say "be welcoming".

(Some evidence that other people feel this way too -- as of now, 4 people upvoted my previous comment.)

So it sounds like "recruit outside your immediate social network" and "be welcoming" may be equally important? You seem to have had some success with the "recruit outside your immediate social network" part--what has worked for you?

Random unrelated note: one interpretation of "be welcoming" is "suppress weird ideas to avoid scaring away newcomers". I think this approach has benefits, but it also has a few important costs. First, weird ideas will get discussed anyway. If getting classified as a newcomer makes you not privy to those discussions, and you can tell, you won't feel welcomed. Second, weird ideas bind people together. The modern world is a lonely place, and people want meaningful group cohesion. That's why Crossfit is such a hit. Absent exercising really hard together, we should use whatever we've got. Third, weird ideas are an important part of EA, and if someone dislikes weird ideas, that's evidence that they're not a good fit. Therefore, I propose that instead of suppressing weird ideas, we share them with newcomers as though they are being let in on a secret.

Recruit outside your social network: We teach a DeCal (student taught course) about Effective Altruism that is posted on a list of DeCals that all Berkeley students can see, and we play speed Giving Games with random students who are walking down Sproul plaza. The class has been pretty successful; the speed Giving Games not so much (most successes there are with people who already know someone in the club). I'll be posting a retrospective about EA Berkeley soon with more details. TL;DR: This is hard to do.

I would say "be welcoming" is more important -- there's already a small base rate of people outside standard EA social networks that have some interest in EA. Currently I think a very large proportion of them (>80%) end up not becoming a part of EA. (Compared to ~50% for standard EA demographics.) Bringing that number down would be very helpful, and I think is more tractable.

Re: random unrelated note: That makes sense, but I will say that my impression of "be welcoming" is not "suppress weird ideas", I've said more about this in a comment above.

I think the founder effect explanation is definitely a big part of the story in terms of the demographic makeup. However, that does not mean being more welcoming won't help. As a white founder myself, I have learned the hard way that racial diversity in particular must be actively cultivated, and you can get yourself very deep into a hole after a short time because people start to notice the demographic makeup and make judgments and inferences about the community based on it. I was pleased to see more racial diversity at EA Global this year than I expected (although still very few black and Latino participants), but one thing I couldn't help but notice is that there were no non-white speakers at ANY of the sessions I attended over two days. That's the sort of thing that can be perceived as unwelcoming for someone who has made the commitment to attend a conference and has already spent a bunch of time being one of the few people in a room who looks a certain way, and it also shows that it's not just about who expresses interest in EA in the first place.

Selection bias likely makes looking at average NPS unwise. People willing to take flights to go to a conference about X are likely more enthusiastic about EA, and so willing to promote it to others. If the point is merely there is this cohort of ~ 2000 EAs who are very keen, fair enough. Yet this does not provide a huge amount of information about the perception of the 'EA brand' - EA global might have selected the dedicated fanbase out of a much larger pool of the ambivalent or antagonistic.

[anonymous]

It's fair to suggest that we don't get carried away with NPS and it's fair to argue that NPS may not represent EA's brand as a whole.

But, for what it's worth, asking EAG attendees about EA doesn't seem like a stronger selection effect than the usual context for this question. NPS is about consumer loyalty. That means someone has to purchase the product before you can ask it.

If you ask someone for their NPS on an Apple Laptop, they have to spend $1K+ on the laptop first. It's not clear that asking this question of people that attended a conference is substantially different.

I think the point is that for NPS, we're interested in what all effective altruists think, since they're the users of the product. But EAG attendees are not likely to be typical effective altruists: they will probably be more committed, and more positive about EA than a typical EA is.

To continue the Apple analogy, it's a bit like basing your NPS score not on everyone that buys a laptop, but on the people who comment most on Apple product forums: these people won't be typical of Apple's consumers.

[anonymous]

Out of curiosity, what would people accept as evidence for or against the "EA is unwelcoming" hypothesis?

Some kind of anonymous survey mechanism that managed to capture people who had interacted with EA in a low-to-medium-intensity way (eg, through the Facebook group or one of the websites, through playing a giving game at a campus group, attending one meet up of a campus group, etc) and tracked a) whether they interacted at higher-intensity (eg, applying to EAG) later, and b) whether they internally felt it was welcoming.

My current belief is something like "EA is unwelcoming to people not in the standard EA demographic". So based on that:

Weak evidence:

  1. EA demographics have been moving towards "normal" (eg. less gender bias, more racial diversity, but probably still mostly relatively rich people). A priori, I would expect this to happen anyway, but at a pretty gradual rate, something like 1-5 percentage points per year for gender bias.

Moderate evidence:

  1. Data that shows that people not in the standard EA demographic are just as likely to be committed EAs after a medium/high interaction that involved meeting other EAs in the real world (eg. physical meetups or EA Global).

  2. Data that shows that people not in the standard EA demographic are just as likely to be committed EAs after a low/medium interaction (eg. a Giving Game). A priori, I would expect that they are just as likely to want to learn more, but are less likely to continue on the path after learning more and engaging with the community.

  3. Something similar to the above two, like Ajeya's suggestion.

Strong evidence:

  1. The majority of other local EA group leaders disagree with me.

For this issue specifically, I trust observations made by local group organizers more than I would trust large scale observational/correlational data, just because I can imagine so many different factors at play here that even if the data supported the hypothesis that EA as a whole is welcoming, I would still expect there to be several subfactors where we could and should still improve. (Though I could be convinced that it may not be worthwhile to do so.)

You could also ask these questions of EAG attendees who had relatively little contact with the community before attending.

I don't have a great answer to this and think it's pretty tough to capture with data. Given that, I'd probably go with something like Ajeya's suggestion. Just asking people whether they felt welcome, whether they had any experiences that made them feel unwelcome, whether they plan to continue to engage the community and why, whether the community could have done things to make them feel more welcomed, etc. seems like the best bet.

[anonymous]

the team most likely to be growing EA, the Effective Altruism Outreach team

Evidence please.

The "Ways of hearing about EA over time" and "Ways people got involved compared to ways they heard about EA" sections of the last EA survey results and analysis are relevant here.

[anonymous]

Unfortunately, it's hard to disentangle personal referrals from the EA survey data since it could be that someone was personally referred to 80K or GiveWell for example. However, the claim is unsubstantiated so I'll change it.

The title of this post sounds a bit cultish. :-)

The title, unfortunately, gives me slightly more negative vibes towards EA. :(

"Lets poll EAG attendees to see how EAs feel about EA" -no statistician ever

The 2015 EA Survey asked questions about welcoming, which may be more representative (though still biased and not truly representative).

We could add NPS to the 2016 Survey.

I'm an EA. I've donated over 50% of my income for over 3 years, and I've been actively volunteering (informally and formally) for over 8 years. I have rarely felt comfortable at an EA event or meetup.

I've met a handful of people who donate 10%, and a handful of people who do some volunteer work. I've also met a bunch of people I suspect of being more interested in philosophy and socializing than altruism. EA community building is a huge disappointment.

There is huge potential in EAs working together -- the whole is greater than the sum of its parts. But after 3 years of trying, I'm about ready to give up.

What would an ideal EA event look like to you? Would you like to see more discussion on earning to give and where to donate? Do you feel like earning to give is underappreciated in the EA community?

Just my opinion. I'd like to see more EAs working together. For example, at a couple of events there were discussions of helping people pursue higher-paying jobs in software development. I also met another EA who invests money for EAs at returns significantly higher than the market without taking fees.

[anonymous]

I don't think this use of NPS is substantially different from other uses of it. It's a loyalty question which means that you ask it of people who purchased your product. If Apple asks NPS on a MacBook someone has to spend $1K+ on the laptop first. I don't see how asking people who attend a conference is substantially different.

I'm not claiming NPS is a representation of the strength of the EA brand overall.

Also, while disagreeing with the post is fair game, I don't think the sarcasm is helpful.
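For concreteness, the scoring rule described in the post (promoters are 9s and 10s, detractors are 6 and below, 7s and 8s are ignored) can be sketched in a few lines. This is a minimal illustration of the standard NPS formula, not the actual script used for the EA Global data; the sample responses below are made up.

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6).

    Passives (7-8) count toward the denominator but neither numerator term.
    Returns a score in the range -100 to +100.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical sample: 6 promoters, 2 passives, 2 detractors out of 10
sample = [10, 10, 10, 9, 9, 9, 8, 7, 5, 3]
print(nps(sample))  # (6 - 2) / 10 -> 40
```

Note that the mean (8.53 in the post) and the NPS (+45) are computed from the same responses but can move independently, since NPS discards the passives.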
