
How concerned should we be about replaceability? One reason some people don't seem that concerned is that the leaders of EA organizations reported very high estimates for the value of their new hires. About twenty-five organizations answered the following question:

For a typical recent Senior/Junior hire, how much financial compensation would you need to receive today, to make you indifferent about that person having to stop working for you or anyone for the next 3 years?

The same survey showed that organizations reported feeling more talent constrained than funding constrained.

On a scale of 0 to 4, respondents saw themselves as 2.8 constrained by talent and 1.5 by funding, similar to last year and consistent with the donation trade-off figures.

The 2019 survey replicated the results on talent vs funding constraints. It also had useful information on which skills organizations felt were in demand.

Replaceability

On the other hand, "After one year of applying for EA jobs: It is really, really hard to get hired by an EA organization" is one of the most highly upvoted threads on the EA Forum. The author had a very strong resume and applied for twenty positions, all of which he lists; he was rejected by every one. He claims his situation is not unusual:

I know several people who fulfil all of the following criteria:
- They studied/are studying at postgraduate level at a highly competitive university (like Oxford) or in a highly competitive subject (like medical school)
- They are within the top 5% of their course
- They have impressive extracurricular activities (like leading a local EA chapter, having organised successful big events, peer-reviewed publications while studying, …)
- They are very motivated and EA aligned
- They applied for at least 5 positions in the EA community and got rejected in 100% of the cases.

He had gotten encouragement from some well-informed EA leaders, who seemed pretty surprised at how badly his job search went. The comments on the thread are, in general, very informative; several people give full details of their own EA job searches. Here I will quote one comment by the author:

People working at EA organisations, sometimes in senior positions, were surprised when they heard I didn't get an offer (from another organisation). I'd guess around half the organisations I applied to were "surprised about the very strong field of applicants". Past messaging about talent constraints probably also plays a role. As a result, career advice in the EA community can be overly optimistic, to a point where more than one person seriously encouraged me to apply for the COO position at OpenPhil (a position which went to the person who led the operations for Hillary Clinton's election campaign(!)). At least a year ago, when I was talking to dozens of people for career advice, I got the impression that it should be comparatively easy to get hired by an EA organisation.

I assume EA leaders have become more conservative in the advice they give, since the thread was very widely read. But I don't think the fundamentals of the EA job market have changed very much.

It is somewhat surprising the EA job market is so competitive. The community is not terribly large. Here is an estimate:

This suggests that there are roughly 2000-3000 highly engaged EAs in total.
Likewise the estimated size of the EA community more broadly, worldwide, is only 5000-10,000 or about the size of a small town or university.

This suggests to me a very large fraction of highly engaged EAs are interested in direct work.

Two Perspectives

One point of view is roughly the following: It is extremely important to get the best person for the job. Organizations highly value their recent hires but seemingly lack the capacity to ramp up hiring effectively. Performance in many positions varies enormously. Therefore, if there is even a chance you are the best fit for a job, you should apply. Organizations are slow to fire bad performers, and people are bad at judging their own aptitude. If we discourage applicants, some very strong candidates will be too humble to apply. People who build career capital in hopes of getting an EA job are unlikely to be hurt even if they never do direct work. The expected value of searching over a larger set of applicants is high.

A different point of view is that a huge percentage of engaged EAs appear to want to do direct work, and engaged EAs as a group are quite talented. Unless you are an unusually good fit for a direct-work job, that job should, and probably will, go to someone else. Some direct-work jobs require genuinely rare skills. But if the job seems like it could be done by an average Princeton grad, it will probably have qualified applicants and you are unlikely to be the best fit. The risk of discouraging people is real, but so are the costs of giving people the wrong impression of their prospects. People can feel betrayed and lied to. Engaged EAs might become discouraged or decide they cannot trust the EA information ecosystem. The early years of a person's career can be very impactful, and it is unwise to encourage people to plan for careers that probably won't work out.

Ideas

Let us imagine the second perspective is correct and consider what to do. Of course, you can still do direct work if there is a good fit for your specific abilities and experience. You can also look for career capital you are especially suited to build quickly (for example, trying to pivot into AI safety). But other effective altruists may think of the same plan.

One scalable option is clearly earning to give. If you earn to give and donate X dollars per year, you increase the total amount donated to effective charities by roughly X dollars. The marginal value of EA donations does decrease as more total dollars are donated, but we seem pretty far from exhausting opportunities for effective giving. Overall, earning to give does not suffer much from replaceability issues.

Another option is to be willing to make choices other effective altruists seem unwilling to make. For reasons I do not fully understand, it seems like few EAs want to try starting an organization even though many (perhaps most) want to work at one. Having more organizations seems useful to me: it would add healthy competition, and the talent pool is clearly there. Perhaps potential founders think they cannot get funding? On net, trying to found an org, conditional on having a strong team and vision, seems like a plausibly high-EV idea. Founding an org probably decreases the odds of other orgs being funded, so there are some replaceability concerns.

One could also do EA work for low or potentially no wages. Starting an org with no seed funding would effectively require doing this for some amount of time. Any EA org that is talent constrained rather than funding constrained should not be offering very low wages. But perhaps there are ways to produce effective research for no compensation. Doing this with no structure might be an ineffective use of human capital; organizing volunteers can be challenging, but perhaps volunteer-run EA orgs could be created?

You could consider taking on legal risks. Various forms of non-violent direct action might be an effective way to achieve high-impact political change. This view seems most common in the animal rights/welfare community. The number of people willing to take legal risks is quite low, so replaceability concerns are negligible.

There are some ways to 'donate' things besides money. Various organ donation charities have received some EA funding, and donating a kidney, or a portion of a liver, is not replaceable. Various personal forms of mentorship are probably not replaceable either. It is hard to imagine making anything like a career out of these opportunities, but they are worth keeping in mind.

This post also appeared here: Replaceability Concerns and Possible Responses


Comments

It is somewhat surprising the EA job market is so competitive. The community is not terribly large. Here is an estimate...This suggests to me a very large fraction of highly engaged EAs are interested in direct work.

We have data from our careers post which addresses this. 688 people (36.6% of respondents to that question) indicated that they wanted to pursue a career in an EA non-profit. That said, this was a multi-select question, so people could select this alongside other options. Also, 353 people reported having applied to an EA org for a job. There were 207 people who indicated they currently work at an EA org, which, if we speculatively take that as a rough proxy for the number of current positions, suggests a large mismatch between people seeking positions and total positions.

Of those who included EA org work within their career paths and were not already employed in an EA org, 29% identified as "highly engaged" (defined with examples such as having worked in an EA org or leading a local group). A further 32% identified with the next highest level of engagement, which includes things like "attending an EA Global conference, applying for career coaching, or organizing an EA meetup." Those who reported applying for an EA org job were even more highly engaged: 37.5% "highly engaged" and 36.4% at the next highest level of engagement.

One thing you don't mention in your article is the many existing organisations that aren't labelled as "Effective Altruist" but could take thousands of talented new staff each year.

International development charities, governments around the world, academia, information security organisations, groups specialising in improving relations between the West and countries with newly powerful economies... there are a lot of places where people can make a difference!

I believe this only applies to certain causes, mainly global poverty. If you want to work on existential risk, movement building, or cause prioritization, basically no organizations are working on these except for EA or EA-adjacent orgs. Many non-EA orgs do cause prioritization, but they generally have a much more limited range of causes they're willing to consider. Animal advocacy is more of a middle ground; I believe EAs make up somewhere between 10% and 50% of all factory-farming-focused animal advocates.

(This is just my impression, not backed up by any data.)

I think most x-risk organisations aren't explicitly EA, right? Like if you're interested in AI, you might work at OpenAI or DeepMind; for nuclear safety, you might work in the International Nuclear Safety Group; for bioweapons, you might work at the Implementation Support Unit for the Bioweapons Convention at the UN.

We could certainly use a lot more EA x-risk people in governmental and intergovernmental bodies (easily a thousand), as well as some in think tanks and academia.

I don't know much about cause prioritisation; I thought that a lot of them were in academia. I agree that EA movement building by definition will only involve working for and with EAs!

Why do you think orgs labelled 'effective altruist' get so much talent applying but those other orgs don't? How big do you think the difference is? I am somewhat informed about the job market in animal advocacy, and it does not seem nearly as competitive as the EA market. But I am not sure what the magnitude of the difference means for the replaceability analysis.

I think organisations labelled 'Effective Altruist' are more prestigious amongst our friends. People like to work at places that are widely recognised and hard to get into, don't they? I'm not sure how many applicants these other organisations receive, though.

Great post! Been meaning to comment for a while - better late than never, I suppose.

One thing I wanted to add - I've talked with ~50 people who are interested in working at EA orgs over the last six months or so, and it seems like a lot of them come to the decision through a process of elimination. Common trends I see:

  • They don't feel well-suited for policy, often because it's too bureaucratic or requires a high level of social skills.
  • They don't feel well-suited for academia, usually because they have less-than-stellar marks or dislike the expected output or bureaucracy of academia.
  • And they aren't interested in earning-to-give, almost always because of a lack of cultural fit. (They want to have colleagues who are also motivated to do good in the world.)

Per 80,000 Hours' recommended career paths, that pretty much leaves working at effective nonprofits as the only option. And conveniently, nonprofit work (especially non-research roles) doesn't usually come with a high bar of qualifications. A lot of positions don't require a bachelor's degree. Depending on the role, it's not uncommon to find a year of vaguely defined experience as the only minimum qualification for an entry-level job. So that seems like a reasonable choice for a lot of people... except that hundreds of other EAs also see this as a reasonable choice, and the competition grows very quickly.

I've certainly met EAs who seem really well-suited for direct work at EA orgs. But, in part because of the reasons mentioned above, I think the majority of people would be better off focusing their jobseeking efforts elsewhere. I do worry about swinging the pendulum too far in the opposite direction, where talented people stop applying to EA organizations.

I guess my recommendation for people interested in direct work would be to apply to EA organizations that interest you and that you think fit your skillset, but, at the same time, to also apply for EA-aligned organizations and/or impactful non-EA jobs where replaceability is likely to be lower. I also think, if you're uncertain about whether to apply for or accept an EA direct work role, you can usually talk to the hiring manager about what they feel your counterfactual impact might be. The nice thing about applying to EA orgs is that they understand those concerns, and it likely won't negatively affect your application - in fact, it might reflect positively on you for thinking critically and altruistically (for lack of a better word) about your career.
