By ladder climbing, I mean the ability to succeed and increase in rank in well-defined existing bureaucracies and prestige ladders, particularly in non-EA orgs and institutions of particular interest to EAs. Potential examples include large tech companies like Amazon or Google, government bureaucracies like the CIA or WHO, party politics like the US Congress, and academia.
Suppose somebody is fairly talented in an EA-important area, and would be a decent but not stellar fit for work within movement-EA organizations. Suppose they are approximately average at ladder-climbing (relative to other people of similar general competence). Suppose further that they aren’t very entrepreneurial and have no plans, ability, or stability to acquire that skillset or motivation (so starting their own company or nonprofit is right out).
Given that they can’t start their own thing, should we in general advise them to work in an existing EA org, or is it better to recommend they seek out existing ladder-climbing opportunities elsewhere?
Of course, details often matter and this answer may well depend on which organizations and prestige ladders you’re interested in climbing in particular. Please feel free to elaborate more in your answer.
Note that I’m only interested in people who are decent but not great fits for EA orgs. People who are stellar fits for core EA orgs should probably work there, and people who cannot work in EA orgs shouldn’t work in EA orgs.
I included some of my own thoughts below. Feel free, however, to answer without being biased by my possibly dumb ideas!
Some preliminary thoughts
The case against:
I think in most cases, extraordinary impact from doing good outside of standard EA career paths looks like one of:
a) being unusually good at climbing existing prestige ladders, such that you can rise to the top.
b) being unusually self-directed and motivated, with good judgement (which I shortened to “entrepreneurial” but I realize is not the same thing), or
c) both
Unusually good at climbing prestige ladders
The first case is someone who can win (or has a sufficiently high probability of winning) prestige ladders despite going up against other people who are good at this and aiming for the same thing. Note, however, that it’s not always ex ante clear what makes someone good at winning these prestige ladders, so there’s a general argument that some people may find it valuable to “test” their skills/fit for this (H/T Michael Sadowsky).
Hypothetical examples of this would include people who, eg, have a clear shot at becoming a congressperson, a top professor, the head of an important US/UK/France/Russia/China/Canada agency/department, or a director of research at a top ML lab, without being too distracted at the top by local incentives and near-term attention grabbers.
Once they win these prestige ladders, they can do useful things like propose bills other EAs suggest that are good for EA causes, publish papers arguing for AI safety, etc.
I’m not personally aware of real-world examples here.
Unusually entrepreneurial
The second case is someone trying to pull the rope sideways, who has enough clarity, vision, and general gumption to spot opportunities when they arise and act on them. Hypothetical examples of b):
- Someone working in a large BSL-4 lab on gain-of-function research realized the safety standards at the lab were unexpectedly lax, and then dedicated spare time to (successfully) lobbying internally for higher safety standards
- Something similar but in a corporate ML lab working towards AGI
- Like the above examples but willing (and able) to be a whistleblower if internal fixes won’t happen
- Somebody working in pig breeding/genetics discovered a novel technique that can improve affective welfare at low cost, and then campaigned internally and externally to make this the new norm
- Someone working in a top (bio)tech company who started a large and successful AI (or bio) safety reading group there
- Somebody working in a Fortune 500 company who successfully lobbied to switch most corporate philanthropy to EA purposes
- A congressional staffer who realized that a bill that came to her congressperson’s desk was more important than anything she had ever seen before, and then dedicated all her spare time and political capital to making that bill as good as possible.
- etc
A real world example is the FDA economist described here, who was asked to evaluate the trans fats ban, crunched the numbers, realized it was orders of magnitude more important than anything he had done before, and then proceeded to devote all his energy to making it happen. This is probably noticeably better than either a) donations or b) the impact of the majority of other economists (at least in the health/development space).
Another real world example is Tara Mac Aulay’s work in pharma, especially in Bhutan.
Another real world example is the authors of Compassion, by the Pound (though the authors probably do not identify as EA?). Despite the authors not being unusually prestigious by the standards of economics academia, I think the book was plausibly very valuable for EA purposes because a careful treatment of animal welfare by economists just hadn’t been done before.
Unusually good at ladder climbing and unusually entrepreneurial
c) is a combination of unusually high ladder-climbing ability with self-direction, motivation, and judgement, such that one could not only rise through the ranks but also have an unusually large impact once they are at the top. A real world example of c) is Jason Matheny’s work as the (eventual) head of IARPA, advancing the fields of forecasting, biosecurity, and AI policy, among other things.
Notably, these hypothetical and real world examples involve relatively unusual motivations and skillsets, which by stipulation do not apply to the candidates for career advice considered here.
The case for climbing ladders outside of EA
I think there are a number of other reasons for climbing ladders outside of EA, even if they naively have lower EV than one of the three paths I illustrated above:
- Serving as sense organs for the EA community.
- It’d be really good for the EA community to have a sense of “what’s going on” in various fields of interest that are not current top picks like AI or biosecurity (eg it’d be good to have a few people in the know for the cutting edge of atomically precise manufacturing, anti-aging, nuclear security, human genetic enhancement, maybe artificial wombs, maybe space, etc, as well as the bureaucracies of various countries).
- (This to me is the most plausible case in general)
- Paving the way for other EAs to climb career ladders
- For example, if you collect notes on how to climb ladders in tech or in the US legislature, they might be helpful for other aspiring ladder-climbers (eg aspiring congresspeople), even if you have no shot yourself
- Paving the way for other EAs to be intrapreneurial within your org
- For example, attending EA meetings and generally being supportive of other EAs may be valuable for internal intrapreneurship efforts even if you don’t have the skillset or motivation to lead such efforts yourself
- Being broadly welcoming to a larger segment of the population, boosting recruitment efforts?
- The argument here is a combination of
- If more people know more EAs, more people will be interested in EA and be willing to contribute to EA efforts
- If only EAs who can’t get EA jobs (or won’t because they dislike EA orgs/EA jobs) are visible outside the movement, this creates pretty strong negative selection pressures on who sees us and may make EA look less interesting.
- I find the argument plausible, but I think a strong counterargument is that the best way to do EA recruiting/movement-building is unlikely to be something other than devoting your time directly to recruitment/movement-building
- For non-EA orgs with sufficiently high benevolence and intelligence, increasing their power by expanding their talent pool.
- I never seriously considered this argument before but it seems plausible-ish?
- However, it intuitively seems surprising that, for most individuals, this would be a compelling theory of change competitive with their other options.
- Career capital
- The argument here is that climbing at least some ladders can be useful for getting valuable skillsets that can later be leveraged to doing EA work.
- For example, getting a PhD in a wet lab to do bio work, or working in ML at a top tech company to prepare for AI safety, or working in the WHO or a startup for ops experience, etc.
- I think a strong counterargument here is, again, that the most valuable way to prepare for a job is by doing it
- (Knowing almost nothing about the field), I'd personally be surprised if, eg, trying to solve technical problems in existential biosecurity benefits more from skills built in a wet lab bio PhD than from directly trying to solve technical problems in existential biosecurity.
- Money
- Many places pay more than work at EA orgs.
- Money can be traded for goods and services.
- Either earning-to-give to buy units of caring directly or
- Giving yourself financial stability to do other really useful EA efforts
- Though by stipulation the hypothetical people here aren’t as entrepreneurial, so this is probably less useful?
Sidenote on climbing ladders and entrepreneurship within EA orgs
A reasonable counterconsideration here is that the two core skills I mentioned above (the ability to climb prestige ladders and entrepreneurship) are both useful for doing well within EA orgs as well.
I agree with this.
- a) the ability to climb prestige ladders is at heart somewhat indicative of broadly useful skills like
- i) communication skills
- ii) the ability to read and follow explicit and implicit rules/incentive gradients
- both of which I expect to be fairly useful for doing well in EA orgs/projects as well.
- b) entrepreneurship is more useful than a) for EA, as
- our existing orgs are quite small so somebody working as e.g. one of the first 3 web developers or researchers or HR people for an org can help
- set the website/research/HR direction for an org
- and, more broadly, for EA overall.
However, I think it’d be surprising if someone needs more entrepreneurship to do useful work within an established EA org than to do extraordinarily useful intrapreneurship projects outside of EA orgs. (The difference here is similar to that between being a startup founder and being employee #10 at a startup. Clearly employee #10 needs to be quite entrepreneurial, but there’s a qualitative difference between the skills needed for them and for the founder.) I expect even larger differential returns to ladder-climbing abilities.
My question again
So, to repeat my questions:
- How valuable is it for EAs to climb career ladders outside of EA?
- Supposing that they’re not unusually good at ladder-climbing or unusually entrepreneurial, what percentage of people who could do both should, in your all-things-considered view, a) climb ladders outside of EA vs b) work in an existing EA org?
---
Thanks to Lizka Vaintrob, Michael Sadowsky, Charles Dillon, Marie DB, Emily Grundy and others for feedback on early versions of this question. As usual, all errors are my own.