Hey EA Sweden team, really interesting post. Quick question: is there a link to your full budget somewhere? I am a bit unsure whether $65k is more like 50% or 5% of your current spend, and it's pretty hard to get a sense of cost-effectiveness without knowing the total expected spend.
I have a couple of guesses:
Every year, we generally become more well-known, particularly within the communities we consistently reach out to (e.g., EA, animal welfare, etc.). This creates natural momentum and credibility within those communities.
Our previous outreach efforts build up the applicant pool for the current year (e.g., someone who heard about us from a talk two years ago might only apply today).
We have done a lot more active outreach to non-EA communities. I think these communities are particularly affected by the visible success of our graduate...
Thanks!
To me, one of the major stories here is that you have managed to break free of an important limiting factor, at least for the present. That is worth celebrating and learning from.
To quickly chip in with some data I have: participants were pretty happy with the program, scoring it 4.45/5. About half of them received a job offer, placement, or internship following the program, most of which were facilitated or recommended by us. We have not done a full counterfactual estimate yet, though, as I do think talented people often get offers anyway, even without the extra skills/credentials the program provides. So the counterfactual rate might be closer to 33%.
Hey Silas, really glad you wrote this up. I also recently donated bone marrow (after donating blood many times and being a bit torn on kidney donations). My experience was equally positive and probably even easier logistically (from London, UK).
Some hard-nosed calculations for those who might be interested (which I will write up in a full post one day): I lost about 1 full day of work, and I would expect the average person to lose 1-3 days of work if they wanted to lose as few workdays as possible. My best estimate is this saved between 4-12 years of l...
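For concreteness, here is a minimal sketch of that trade-off using the ranges above. This is illustrative only; the function name and the framing of "life-years saved per workday lost" are my own, not a full counterfactual estimate:

```python
# Illustrative back-of-the-envelope for the trade-off above.
# Numbers are the rough ranges from this comment (1-3 workdays lost,
# 4-12 life-years saved), not a full counterfactual estimate.

def life_years_per_workday(workdays_lost: float, life_years_saved: float) -> float:
    """Expected life-years saved per workday of lost productivity."""
    return life_years_saved / workdays_lost

# Pessimistic end: 3 days lost, 4 life-years saved
low = life_years_per_workday(3, 4)
# Optimistic end: 1 day lost, 12 life-years saved
high = life_years_per_workday(1, 12)

print(f"{low:.2f} to {high:.2f} life-years saved per workday lost")
# prints: 1.33 to 12.00 life-years saved per workday lost
```

Even the pessimistic end of that range looks like a very favorable ratio, which is the point of running the numbers at all.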
Just wanted to chip in that I am quite positive about this choice and the direction that CEA could go in under Zach's leadership. I have found Zach to be thoughtful about a range of EA topics and to hold to core principles in a way that is both rare and highly valuable for the leader of a meta organization.
Do you know anything about the strategic vision that Zach has for CEA? Or is this just meant to be a positive endorsement of Zach's character/judgment?
(Both are useful; just want to make sure that the distinction between them is clear).
I think the model I would suggest is indeed close to what Joel is saying - a "try it out" system, as opposed to guessing a priori how you will be affected by things. More specifically, track your work hours/productivity (if you think that is where the bulk of your impact is coming from) and see whether, for example, donating blood on the weekend affects them negatively, positively, or not at all. I think my output has gotten higher over time partly due to pretty active testing and higher amounts of measurement. - Related post
I do tend to think that most people's limiting factor is energy rather than time. E.g., it is rare to see someone work until they literally run out of hours on a project, versus needing a break due to feeling tired. Even for people working 12 hours a day, I still expect they run out of energy before time, at least in the long term. I would not typically see emotional energy specifically as my limiting factor, but I do think it's basically always energy (a variable typically positively affected by altruism in other areas) rather than time or money (typically negatively affected).
energy (a variable typically positively affected by altruism in other areas)
This assumption seems totally out of left field to me. I agree altruism can increase energy, but in many other cases it uses it up.
I echo this view and think it's really exciting. I expect many people in the meta-funding space will be positive about this idea. However, I also anticipate that many of the donors will need to see a round or two of this idea executed and observe the resulting grants before donating to the fund.
As shown in this table, 0% of CE staff (including me) identify AI as their top cause area. I think people's reasons vary across the team but cluster around something close to epistemic scepticism. My personal perspective is also in line with that.
Hey Yanni!
Quick response from CE here as we have some insight on this:
a) CE is not funding-limited and does not think AI is an area we will work on in the future, regardless of available funding in the space (we have been offered funding for this many times in the past). You can see a little bit about our cause prioritization here and here.
b) There are tons of organizations that aim or have aimed to do this, including Rethink Priorities, Impact Academy, Center for Effective Altruism and the Longtermist Entrepreneur...
You've given lots of reasons here, and cited posts which also give several reasons. However, I feel like this hasn't stated the real & genuine crux - which is that you are sceptical that AI safety is an important area to work on.
Would you agree this is a fair summary of your perspective?
About 75% of seed project proposals get funded at the amount they ask for. That part is not known until after the incubation process. The typical seed grants are between $100k-$200k. I do not expect a great proposal to be stopped by a $25k higher budget. I think entrepreneurship is a higher-risk career path, one that is probably not suited for the majority of people. CE is already extremely de-risked relative to equivalents in the for-profit and incubated nonprofit space, to the point where I think the founding step is not the highest-risk part of founding a charity (having an impact 3 years down the line is).
It might be helpful to add some useful reference classes here as I think it's often forgotten how unusual EA salaries are relative to other fields.
- World GDP per capita: £11,000
- London living wage: £21,800
- Median UK full-time salary: £26,800
- Average nonprofit salary: £31,700
- Average salary in London: £39,000
- Average nonprofit salary in London: £39,600
- Average CE employee salary: £39,300
- Entry-level EA job: £48,000
- Average EA job: £80,000
Thank you for posting this, Joey. I think people too often talk about things like this in the abstract, without knowing the realities of the market. Two considerations I have when thinking about salary:
"My default interpretation is that someone doesn't value the role or my work very much."
I think this is a pretty unfortunate norm that some EAs have. In practice, it results in EAs disproportionately prioritizing the best-funded areas instead of the most impactful ones. I think the reality is that offered salaries have far more to do with funding availability and the perceived counterfactual value of funding. At the end of the day, AMF can absorb more money, and thus there is a higher bar for spending in global health than there is in areas without clear benchmarks.
Salary questions and discussions always happen well before someone goes through the program (typically during the interviews or soon after an invitation is offered). Ultimately, the co-founders select how much they ask for, and many have asked for considerably higher amounts.
I think this is a pretty simple and incorrect model. Job desirability is considered based on many traits (salary being one, but far from the only one), with different individuals weighing those traits at different levels of importance. If salary is the most important factor for a job, CE will basically never compete given the talent requirements; if autonomy or impact is the most important trait, it will compete even at very low levels of salary.
I am pretty skeptical that this would be the best way to increase diversity in EA per dollar. I talk to quite a number of incubation programs, both inside and outside of EA, and for most forms of diversity, I do not think that a low salary is the top barrier. I think that for age diversity, there is a case for this, but for country-level diversity, it might pull in the opposite direction. My soft sense is that both CE staff and the CE cohort are unusually diverse relative to the EA movement as a whole and similar EA incubation programs, despite having low salaries.
Also, a factual update: I think most numbers right now for founders are more in the 40-60k range.
I think it holds up. I wrote a highly upvoted post on organisations being transparent about their scope one month ago due to similar concerns.
A budget from 2021 is outdated, and the information on projects you are not currently working on is a blog post. With regards to the budget, I believe this is the legal minimum that has to be made public, though much of the data is combined within EVF, making it harder to pull out specific details. I think the budget is inherently tied to the scope, as it's challenging to truly understand where an organisation is allocating its resources without this kind of basic information. For instance, if an organisation spends a large percentage of its resources on a certain area, any cause preference in that area would have a much greater impact on the overall scope of the organisation.
"I think beyond this level of transparency, CEA is probably hitting rapidly diminishing returns or taking attention away from more important topics." I would be surprised if this was the case as some pretty basic stuff is missing, e.g. I could not find a recent public budget for CEA.
Hey Rob,
I wonder if filling out something like the template I laid out in this post could allow transparency without disclosing confidential details for the CEA group's team.
Hey Anon, indeed, the categorisation is not aimed at the target audience. It’s more aimed at the number and requires specific ethical and epistemic assumptions. I think another way to dive into things would be to consider how broad vs. narrow a given suggested career trajectory is, as something like CE or Effective Altruism might be broad cause area-wise but narrow in terms of career category.
However, even in this sort of case, I think there is a way to frame things into a more answer vs. question-based framework. For example, one might ask something like:...
What concrete actions might this suggest?
I think the most salient two are connected to the other two posts I made: organizations should have a transparent scope (especially organizations whose current focus might surprise people), and they should not use polarizing techniques. I think there are tons of further steps that could be taken; a conference for EA global health and development seems like a pretty obvious example of something that is missing in EA.
Hey, I think this is a pretty tricky thing to contemplate, partly because organizations are not as transparent about their scope as would be ideal. However, I will try to describe why I view this as a pretty large difference, keeping 80k as the example.
1) Tier-based vs. prioritized order
So indeed, although both organizations list a number of cause areas, I think the way CE does it is more in tiers, e.g., there is not a suggested ranking that would encourage someone to lean towards health and development policy over family planning. On the other han...
Indeed, I think those point in the right direction, and this post by 80k stands out as one of the clearest examples of things I would like to see more of. For example, you can gather from this 80k post that ~20% of effort goes to all areas outside of x-risk/EA meta, which I think would be quite surprising for many people in the EA community. However, I still think this information is not well-known or internalised by the broader community.
Your ordering of who has struggled with this in the past matches my sense, although I would add that I think it's particularly important for community building, meta, and EA leadership organizations. These are both the least inherently clear about what they do and the most engaged with a counterfactually sensitive community.
On CEA, I think a chart like the one I outlined in the post would be super useful, as I pretty constantly hear very different perspectives on what CEA does or does not do. I understand this might change with, e.g., a new ED, but even understanding 'what CEA was like in 2022' I think could provide a lot of value.
This is close to what I am saying, but I might phrase it stronger. For example, a large donor may consistently be a potential fit for your field, but I still believe it's important to be considerate about how far you push them. Similarly, a highly talented individual might require more than just signposting; they also should not be perceived as second-class or unintelligent for having a different viewpoint.
Short responses here:
Why are more people a fit for for-profits? I think for-profits require many of the same skills but far less focus on impact (even if the founder of that for-profit aims to donate their earnings). I think the M&E and fundraising requirements of an NGO are harder than the equivalent in the for-profit space.
Why 20%? It's not a deeply considered number. I can easily imagine it being 10%, although the stats I saw suggested the current population was somewhere between those two numbers. I think one could argue both that too many people are do...
Indeed, I think having a separate non-GiveWell run global health fund would be really great. Most of the members of our seed network donate between 10k-100k, so that is also an option for folks in that range. But I do wish there were more cause-specific ones.
We have a talk on the key traits of great charity founders here. Right now, our vetting system correlates about 0.7 with charity outcomes two years later.
Hey Spencer, thanks! I'm glad the post was helpful. I think these sorts of questions are both common and pretty universal, so I'm happy to be encouraged to write up a bit more on how CE thinks about it.
Not a ton of writing I love on the topic, but this book is one of the better ones I have read on it: https://www.amazon.com/Talent-Identify-Energizers-Creatives-Winners/dp/1250275814. We will also be publishing our foundation handbook in approximately 3 months, and that has a pretty large section on vetting.
I think the entire ecosystem is important, but my sense is that mid stage is lacking most right now. I feel quite confident on seed funding, and have mixed confidence on late-stage funding, depending on the cause area.
Hey Vasco,
Love the post; I think it is super valuable to have these sorts of important conversations, directly thinking about cross-cause comparison. It’s worth noting that CE does consider cross-cause effects in all the interventions we consider/recommend, including possible animal effects and WAS effects. Despite this, CE does not come to the same conclusion as this post; here are a couple of notes on why:
Strength of evidence discounting: CEAs are not all equal when they are based on very different strengths of evidence, and I think we weight this factor...
We have thought about this, but we are not confident that weaker charities would not crowd out stronger ones with funders and thus lead to less overall impact.
I think tautological measurement is a real concern for basically every meta charity, although I'm not sure I agree with your solution. I think the better solution is external evaluation, someone like GiveWell or Founders Pledge who does not have any reason to value CE charities. Typically, these organizations do their own independent research and compare it across their current portfolio of projects. If CE can, for example, fairly consistently incubate charities that GW/FP/etc. rank as best in the world, I think that is at least not organizationally tautol...
Hey Nescio,
Sadly, my circumstances have changed such that this was no longer possible without significant work-productivity trade-offs. Specifically, I moved to London, UK (due to work) and have only intermittently been living with a partner. I now live on £20k-£30k a year, depending on the year. I still hold the view that a higher salary would not significantly increase my productivity beyond that, and have, if anything, more concerns about the current spending habits of EA, for reasons described pretty well here.
Hey Vlad,
I would definitely expect some of those 1000 ideas to have been researched by Open Philanthropy or Rethink; a long list like that would include both researched and un-researched areas. I think new nonprofits often come at things with a different angle, e.g., ways of weighting evidence, or tweaks in ethical views or baseline assumptions. For example, GiveWell is both highly well-run and huge, but they would not come to the same considerations that HLI has come to by looking at subjective well-being. I think the same thing will happen with CEARCH; there are lots of areas that might be missed by other actors but that would be picked up by a more systematic search done at a lower level of depth per area.
Currently: We have a backend CEA that evaluates the possible scenarios and impact outcomes for each of the charities. It starts out with pretty wide confidence intervals, but these tend to narrow as the charities get older (e.g., their 2nd or 3rd year). We also write up more narrative reviews that go to a set of external advisors.
Long-term plan: We want to hire an external evaluation organization to evaluate every charity we found two years after founding, and use those numbers instead of internal ones.
Compared to other movements it seems pretty good; relative to the ideal, we of course could do better. In general, I think encouraging more critical thinking and debate is likely a step in the right direction. Right now I think disagreements can be handled a bit indirectly (e.g., I would love to see even more open cause area debates instead of just funding of outreach in one area and not another).
Our policy regarding salaries has not changed as much as other meta charities; leanness tends to attract a different sort of applicant. We have a range ($40-$60k) but would consider applications from candidates who need higher than that range. In practice, we have often found the most talented candidates are less concerned with salary and more concerned about other factors (impact of the role, culture, flexibility, etc.). We are a bit skeptical about the perception that talent increases from offering higher salaries (instead of attracting new talent, we typically see the same EA people getting job roles but just for a higher cost).
This, in many ways, is the default path for how many NGOs grow. I think there are quite a few reasons why CE overperforms relative to it. Decentralization broadens the risk profile each charity is able to take, and smaller organizations move far, far quicker. I suspect the biggest factor, though, is not structural but social: the calibre of founders who apply to us is really strong relative to what an organization like CE could hire as program directors. Due to the psychology of ownership, they work far more effectively on their own project than they would as employees of a larger organization.
I think something talking about the concept of cause X, or an area we think is a top contender that many EAs have not yet considered deeply (e.g., family planning). Even with the recent challenge prize on this, I think EA is way over-indexed on exploit vs. explore when it comes to cause areas.
I think there are a few things that fit into this category; how much deference there is in the EA space would be one. Another would be the relative importance of high-absorbency career paths. Some things we have not written about but that also fit would be how EA deals with low-evidence-base/low-feedback-loop spaces, or how little skepticism is applied to EA meta charities.
We try to keep a page with information (including room for funding numbers) for the organisations that get founded through Charity Entrepreneurship. Many of them are in a situation where marginal, small donors could make an impact.
First a meta note less directly connected to the response:
Our funding circles fund a lot of different groups, and there is no joint pot, so it's closer to a moderated discussion about a given cause area than CE/AIM making granting calls. We are not looking for people to donate to us or our charities, and as far as I understand, OpenPhil and AWF do not have a participatory way to get involved other than donating to their joint pot directly. This request is more aimed at people who want to put in significant personal time to making decisions independent...
- It would be helpful if you engaged with the plagiarism claims, because it is concerning that CE is running researcher training programs while failing to handle that well. I agree the rest of what you say here is tricky, but I think it is pretty bad that you publish the low-confidence research publicly, and it has led to confusion in the animal space.
- On 2.5: I think if your ordering is significantly different, it's probably fairly different from that of most people in the space, so that's somewhat surprising/an indicator that lots of feedback isn't reac
...