Over time it was getting less engagement, and I felt the content made more sense as a Substack/newsletter than a forum post - it's not the kind of post that leads to discussion.
It's also not a new critique - see "The Elitist Philanthropy of So-Called Effective Altruism" from 2013.
I'm not sure you have to do anything with it; generally, groups that suggest money/influence should be shifted from A to B will get a negative response from the people it may affect, or from people who disagree with that direction of change. I tend to find energy spent on ideological EA critics less valuable than energy spent on good-faith critics, or on people who are just looking for resources to help them do more good.
Depending on what you are aiming to achieve with that section of the website, you don't have to have notable figures, you could include people who are most relevant (or not include individuals at all).
For example, Magnify Mentoring features people who have benefited from its mentoring programs. EA Philippines has photos of its local community. EA for Christians has stories from members on its community tab and no profiles of people on its intro page.
Thanks for the shout-out akash, I appreciate it.
On engagement, there might be fewer comments/likes on Substack, but it generally gets 1.2k-1.5k views per month, compared to around 200-400 views per month on the forum.
Could the main difference be that TBP is a simple process change with reduced costs, while EA-style giving would fundamentally alter grant evaluation, requiring more overhead from the funder?
I also think EA-style giving would impose extra costs on existing grantees: they would have to provide more evidence of their effectiveness or lose out to orgs that already have those systems in place.
Separately, I think it will be very hard to get existing foundations to shift towards EA frameworks unless their main donors become interested. There is probably more to be gained by finding and helping the ultra-high-net-worth (UHNW) people/orgs that are already inclined towards EA.
There is a post about this (although it was written in 2015).
...There are some good reasons why large donors would want to avoid giving too much money to a charity at once:
- Avoiding excessive reserves: Because of the opportunity costs (other charities could use the money productively sooner), it is undesirable for a charity to hold excessive reserves. Ideally, charities would be promised a steady stream of funding, conditional on meeting specific targets over many years, so that they can plan ahead.
- Risk diversification: Funds should be distributed to se
CGD has a different take on this type of migration.
"Between the start of 2021 and 2022, the number of Nigerian-born nurses joining the UK nursing register more than quadrupled, an increase of 2,325. Becker’s human capital theory would suggest that this increase in the potential wages earned by Nigerian-trained nurses should lead to an increase in Nigerians choosing to train as nurses. So what happened? Between late 2021 and 2022, the number of successful national nursing exam candidates increased by 2,982—that is, more than enough to replace those who had ...
Thanks David, I appreciate the article - I think it's a good indication of how complex the question of immigration is, and why I don't think it's a slam dunk in either direction.
My impression, though, is that the article is pretty poorly researched and misleading, even if some of its arguments might still stand in many cases despite that.
First, it's weird that the article makes zero mention of the state of the Nigerian health system, or of how this mass emigration might be affecting it. Is staffing getting better or worse? Are outcomes getting bette...
I remember the 'subforums' being designed more like chat rooms than actual subforums you could navigate to from a front page.
It doesn't seem like that great an opportunity, as they've randomly selected 10,000 people out of 7.5 million adults. It then looks like you have to come to a consensus with the 50 participants, otherwise the money goes back to her.
I found the Global Skills Partnerships from CGD interesting, but I don't know how active it still is, or whether you can fund it specifically.
As far as I know they weren't funded by donated money; they received a grant from the S&F Fund and a smaller one from Open Phil (I don't think either org takes donations). The rest was self-funded - more details in the original post.
I think it depends on how you define 'narrow EA': focusing on getting 1% of the population to give effectively is different from helping 100 people make impactful career switches, but both could be called narrow in different ways.
One is narrow because it focuses on a small number of people; the other is narrow because it spreads only a subset of EA ideas.
Taking the Dutch Existential Risk Initiative example, it will be narrow in terms of cause focus but the strategy could still vary between focusing on top academics or a mass media campaign.
'Narrow EA' and having >1% of the population fitting the above description aren't opposite strategies.
Maybe it's similar to someone interested in animal welfare thinking alt protein coordination should focus on scientists, entrepreneurs, funders and policy makers but also thinking it would be good for there to be lots of people interested in veganism.
There are a lot of private sector community roles, some with salaries up to $180k - here are some examples from a community manager job board.
It's not necessarily that the "EA" jobs are more poorly paid, just that the people that take these roles could realistically earn much more elsewhere.
One way to think about it is that the aim of EA is to benefit the beneficiaries - the poorest people in the world, animals, future beings.
We should choose strategies that help the beneficiaries the most rather than strategies that help people that happen to be interested in EA (unless that also helps the beneficiaries - things like not burning out).
It makes sense to me that we should ask those who have had the most privilege to give back the most; if you have more money, you should give more of it away. If you have a stronger safety net and access to inf...
Looking at the grants database for 2023, there seem to be only 24 projects listed, totalling ~$204k, which is less than 10% of the money said to be granted in 2023.
Including the 2022 Q4-2 tag, there are now 54 projects with grants totalling $1,170,000 (although this does include some of the examples above). I don't know how many of these grants are included in the total sum given in the original post.
The ten largest grants were:
I think this has been thought about a few times since EA started.
In 2015, Max Dalton wrote about medical research and said the following.
"GiveWell note that most funders of medical research more generally have large budgets, and claim that ‘It’s reasonable to ask how much value a new funder – even a relatively large one – can add in this context’. Whilst the field of tropical disease research is, as I argued above, more neglected, there are still a number of large foundations, and funding for several diseases is on the scale of hundreds of millions of dol...
There are various posts about volunteering here.
I've linked some below that might be the most relevant.
Volunteering isn't free
Effective Volunteering
What is a good answer for people new to EA that request advice on volunteering?
Why You Should Consider Skilled Volunteering
Also, the $70 billion on development assistance for health doesn't include other funding that contributes to development.
The Panorama episode briefly mentioned EA. Peter Singer spoke for a couple of minutes, and EA was mainly framed as a charity that would be missing out on money. There seemed to be a lot more interest in the internal discussions within FTX, the crypto drama, the politicians, celebrities, etc.
Maybe Panorama is an outlier, but potentially EA is just not that interesting to most people, or seems too complicated to explain if you only have an hour.
Yeah, I was interviewed for a podcast by a Canadian station on this topic (because a Canadian hedge fund was very involved). IIRC they had 6 episodes but dropped the EA angle because it was too complex.
I've written a bit about this here, and I think both would be better off if they were more distinct.
As AI safety has grown over the last few years, there may have been missed growth opportunities from not having a larger, separate identity.
I spoke to someone at EAG London 2023 who didn't realise that AI safety would get discussed at EAG until someone suggested they should go after doing an AI safety fellowship. There are probably many examples of people with an interest in emerging tech risks who would have got more involved at an earlier time if the...
I wrote about this idea before FTX, and I think FTX is a minor influence compared to the increased interest in AI risk.
My original reasoning was that AI safety is a separate field, but it doesn't have much movement-building work being put into it outside of EA/longtermism/x-risk framed activities.
Another reason why AI takes up a lot of EA space is that there aren't many other places to go to discuss these topics, which is bad for the growth of AI safety if it's hidden behind donating 10% and going vegan, and bad for EA if it gets overcrowde...
"Which is bad for the growth of AI safety if it's hidden behind donating 10% and going vegan"
This may be true and the converse is also possible concurrently, with the growth of giving 10% and going vegan potentially being hidden at times behind AI safety ;)
From an optimistic angle, "big tent EA" and AI safety can be synergistic - much AI safety funding comes from within the EA community. A huge reason those hundreds of millions are available is that the AI safety cause grew out of, and is often melded with, founding EA principles, which includes giving wh...
If the definition of being more engaged includes going to EAG and being a member of a group, aren't some of these results a bit circular?
EA isn't a political party, but I still think it's an issue if the aims of the keenest members diverge from the original aims of the movement, especially if the barrier to becoming a member is quite low compared to being in an EA governance position. I would worry that the people who would bother to vote have much less understanding of the strategic situation than the people working full time.
Maybe we have had different experiences, I would say that the people who turn up to more events are usually more interested in the social sid...
For better and/or for worse, the membership organization's ability to get stuff done would be heavily constrained by donor receptivity. Taking EA Norway as an example, eirine's comments tell us that (at least as of ~2018-2021), "[t]he total income from the membership fee covers roughly the costs of organising the general assembly," that "board made sure to fundraise enough from private donors for" the ED's salary, but that most "funding came from a community building grant from the Centre for Effective Altruism (CEA)" (which, as I understand it, means Open...
I think one large disadvantage of a membership association is that it will usually consist of the most interested people, or the people most interested in the social aspect of EA. This may not always correlate with the people who could have the most impact, and it creates a sharp divide between who is in and who is out.
I'd be worried about members voting for activities that benefit themselves the most rather than the ultimate beneficiaries (the global poor, animals, future beings).
A separate organisation just for CBGs would also have been more useful than lots of one- and two-person teams with constant turnover.
I thought about this briefly a few months ago and came up with these ideas.
I didn't vote, but there have been posts about issues in richer countries that received votes, where the authors pointed out how they fit into the context of effective altruism.
There have also been posts about mass media interventions, but they generally point to stronger evidence for their effectiveness.
Thanks for diving into the data David. I think a lot of this might hinge on the 'highly engaged EAs' metric, and how useful that is for determining impact versus how interested someone is in EA.
Are you also able to see whether there are differences between types of local groups (national/city/university/interest)?
I would go further and say that more people are interested in specific areas like AI safety and biosecurity than in the general framing of x-risks - especially senior professionals who have worked in AI/bio careers.
There is value in some people working on x-risk prioritisation, but that would be a much smaller subset than the eventual sizes of the cause-specific fields.
You mention this in your counterarguments but I think that it should be emphasised more.
When I started community building, I would see the 20 or so people who turned up most regularly, or whom I had regular conversations with, and I would focus on how I could help them improve their impact, often in relatively small ways.
Over time I realised that some of the people potentially having the biggest impact weren't turning up to events regularly - maybe we just had one conversation in four years - but they were able to shift into more impactful careers. Partially because there were many more people I had one chat with than people I had five chat...
I guess the overlap is quite high for myself between 'impact' and 'impact as a community builder'.
Thanks for writing this post, I've been thinking about this framing recently. Although more because I felt like I was member-first when I started community building and now I am much more cause-first when I'm thinking about how to have the most impact.
I don't agree with some of the categorisations in the table and think there are quite a few that don't fall on the cause/member axis. For example you could have member first outreach that is highly deferential (GiveWell suggestions) and cause-first outreach that brings together very different people that disa...
I think the BOTEC conflates being aware of EA with being an 'EA'.
Also, most people are optimising for other factors when choosing where to live, so the number in the table is much lower in practice.
I meant the communities/organisations that overlap with EA but focus on a specific cause, though it would also be useful to connect people to less EA-related orgs like the Nuclear Threat Initiative, CEPI, etc.
It seems like there is less field building for existential risk in general, and also not that much within specific causes, compared with the amount of EA-specific field building there has been.
This seems to be changing though with things like the Summit on Existential Security this year, and updates being made by people at EA organisations (mentioned by @trevor1 in...
Earlier in the post: 'We also sent out a survey to the Foresight community, which generated 41 responses from participants in our technical groups'.
41 out of ~1,800 seems like an extremely low response rate - one would usually expect ~10%, from what I've heard. Combined with there being a single female respondent, this survey doesn't seem particularly representative of their "STEM community".
In my example I was referring more to orgs like EVF, but I imagine if EA were more centralised there would be a range of larger orgs - some more like EVF, and others more like Open Phil, which doesn't incubate projects.
It seems that there would be more to be gained from building bridges between the STEM and existential risk communities rather than EA more broadly.
EA has a lot of seemingly disconnected ideas that aren't all relevant to most people. Some will be interested in all of them, but most will be interested in just a subset. Likewise with x-risk: some people will have much more interest in one of nuclear/AI/bio risks than in all of them.
I think it would be better to have 20 organisations with about 50 people each than 3 organisations with 50 people and then everyone else working as individuals. One organisation with 1000 people would probably be the worst option.
That doesn't seem to match with EA being a front-cover story last year, shown in a positive light.
I might have missed this, but can you say how many people took the survey, and how many filled out the FTX section?
For the UK, there is data from CAF:
"The proportion of donations going to overseas aid and disaster relief (7% -£931m) halved from a high in 2022 (14%)"