
I wrote this post for my personal Facebook and it was well received, so I thought it could be useful to people here on the EA Forum as well.


My impression is that many people whose top career goal is 'improve the long-term future of humanity' are overly focused on working at a handful of explicitly EA/longtermist/AI-related organisations.

Some of those projects are great, but it would be both crazy and impossible to try to cram thousands of people into them any time soon.

They're also not the natural place for most people to start their career, even if they might want to work at them later on.

The world is big, and opportunities to improve humanity's long-term prospects are not likely to be concentrated in just a handful of places we're already very familiar with.

Folks want to work on these projects mostly because they are solid opportunities to do good, but where does the narrow focus on them come from? I'm not sure, but some drivers might include:

  • They mostly publish and promote what they do, making them especially visible online.
  • It's fun to work with colleagues you already know, or who share your worldview.
  • They don't require people to pioneer their own unique path, which can be intimidating and just outright difficult.
  • They feel low-risk and legitimate. People you meet can easily tell you're doing something they think is cool. And you might feel more secure that you're likely doing something useful or at least sensible.
  • 80,000 Hours and others have talked about them more in the past.

For a while we've been encouraging readers/listeners to broaden the options they consider beyond the immediately obvious ones associated with the effective altruism community. But I'm not sure that message has always cut through, or been enough to overcome the factors above.

I worry the end result is i) too little innovation or independent thinking, ii) some people not finding impactful jobs as they keep applying for a tiny number of positions they aren't so likely to get or which aren't even a good fit, and iii) people building less career capital than they otherwise might have.

Additional problems

First, to give readers some ideas, 80,000 Hours recently put up this list of problems which might be as good to work in as the 'classics' we've written on the most:

  • Measures to reduce the chance of ‘great power’ conflicts
  • Efforts to improve global governance
  • Voting reform
  • Improving individual reasoning
  • Pioneering new ways to provide global public goods
  • Research into surveillance
  • Shaping the development of atomic scale manufacturing
  • Broadly promoting positive values
  • Measures to improve the resilience of civilization
  • Reduction of s-risks
  • Research into whole brain emulation
  • Measures to reduce the risk of stable totalitarianism
  • Safeguarding liberal democracy
  • Research into human enhancement
  • Designing recommender systems at top tech firms
  • Space governance
  • Investing for the future.

The write-up on each is brief, but might be enough to get you started doing further research.

Additional career paths

Second, there's a new list of other career paths we don't know a tonne about, or which are a bit vague, but which we expect at least a few readers should take on:

  • Become a historian focusing on large societal trends, inflection points, progress, or collapse
  • Become a specialist on Russia or India
  • Become an expert in AI hardware
  • Information security
  • Become a public intellectual
  • Journalism
  • Policy careers that are promising from a longtermist perspective
  • Be a research manager or PA for someone doing really valuable work
  • Become an expert on formal verification
  • Use your skills to meet a need in the effective altruism community
  • Nonprofit entrepreneurship
  • Non-technical roles in leading AI labs
  • Create or manage a long-term philanthropic fund

There must be other things that should go on these lists — and some that should come off as well — but at least they're a start.

Again, the descriptions are brief, but they're hopefully a launching pad for people to do more investigation.

(Credit goes to Arden Koehler for doing most of the work on the above.)

Additional jobs

Third, I don't know what fraction of people have noticed how many positions on our job board are at places they haven't heard of or don't know much about, and which have nothing to do with EA.

Some are great for directly doing good, others are more about positioning you to do something awesome later. But anyway, right now there's:

  • 131 on AI technical and policy work
  • 66 on biosecurity and pandemic preparedness
  • 11 on institutional decision-making
  • 95 on international coordination
  • 34 on nuclear stuff
  • 37 on other random longtermist-flavoured stuff

We've only got one person working on the board at the moment, so it's scarcely likely we've exhausted everything that could be listed either.

If nothing there is your bag, maybe you'd consider graduate study in econ, public policy, security studies, stats, public health, biodefence, law, political science, or whatever.

Alternatively, you could develop expertise on some aspect of China, or get a job with promotion possibilities in the civil service, etc, etc.

Which also reminds me of this list of ~50 longtermist-flavoured policy changes and research projects which naturally lead to lots of idiosyncratic career and study ideas.

Anyway, I'm not saying that if you can get a job at DeepMind or Open Philanthropy you shouldn't take it — you probably should — just that the world of work obviously doesn't start and end with being a Research Scientist at DeepMind or a Grant-maker at Open Phil.

There are ~4 billion jobs in the world, and more that could exist if the right person rocked up to fill them. So it's crazy to limit our collective horizons to, like, 5 at a time.

As I mention above, some of these paths can feel riskier and harder going than just working where your friends already are. So to help counter that, I suggest paying a bit more respect to the courage or initiative shown by those who choose to figure out their own unique path or otherwise do something different than those around them.

———

P.S. There's also a bunch of problems that some other people think are neat ways to improve our long-term trajectory about which I'm personally more skeptical — but maybe you agree with them, not me:

  • More research into and implementation of policies for economic growth
  • Improving science policy and infrastructure
  • Reducing migration restrictions
  • Research to radically slow aging
  • Improving institutions to promote development
  • Research into space settlement and terraforming
  • Shaping lie detection technology
  • Finding ways to improve the welfare of wild animals
Comments (6)

Another potential cause of the narrow focus, I think, is some people in fact expecting the vast majority of impact to be from a small group of orgs they mostly already know about. Curious whether you disagree with that expectation (i.e., you think the impact distribution of orgs is flatter than that), or whether you're just claiming that e.g. the distribution of applicants should be flatter regardless?

It could also be the case that the impact distribution of orgs is not flat, yet we've only discovered a subset of the high-impact ones so far (speculatively, some of the highest-impact orgs may not even exist yet). So if the distribution of applicants is flatter, they are still likely to satisfy the needs of the known high-impact orgs, and others might end up finding or founding orgs that we later recognise to be high impact.

This is a great post, thanks for writing this up! 

I agree with the main point, and 80,000 Hours' webpage does make it clear that their top career recommendations (and the specific jobs in these areas that are highly concentrated in a few organizations) are pretty competitive, and most people in the EA movement are not going to be able to get into one of those. When planning my career, I factor in this possibility, but one problem I face is that I don't feel I know enough about these other possibilities, and so there is a lot of uncertainty when I think about what I should do outside of the top career paths and top organizations.

I don't think the solution to this problem is for 80,000 Hours to try to discuss other problem areas and mention other EA-aligned organizations in more detail, because that would take a lot of effort. One thing that could be helpful, though, is to put more emphasis on the process people should go through when planning their careers, with more guidance on how to tackle problem areas that haven't been explored in much detail, how to explore areas that an EA thinks might be relevant but that haven't been explored at all, how to find organizations to work for in the problem areas they are interested in, and what to do if you can't get a job at an organization you really want to work for in the long term.

I believe it would also help to share the trajectories of people in the EA community who have done some innovative work, or people who managed to find jobs at EA-aligned organizations that the movement was previously unaware of, emphasizing how they approached the task. Facilitating networking between people in a certain problem area could also prove really helpful.

I'm not saying there isn't any content on these topics, just that in my experience writing up and improving my own career plan over a few years, I found it much easier to find EA material on why I should take a certain career path than on how to do it more concretely (besides working at top career paths and organizations), and that based on my experience I believe emphasizing these aspects more could go a long way towards helping people structure better career plans.

Hey, if anyone is interested or already immersed in engineering physical goods or supply chain/logistics as their skillset, I want to be your buddy. DM me!

"Designing recommender systems at top tech firms"

Semi-related and somewhat off-topic, so forgive me for following that different track – but I recently thought about how one of the major benefits of EAGx Virtual for me was that it worked as a recommender system of sorts, in the form of "people reading my Grip profile (or public messages in Slack) and letting me know of other people and projects that I might be interested in". A lot of "oh you're interested in X? Have you heard of Y and Z?" which often enough led me to new interesting discoveries.

I'm curious if there may be a better approach to this, rather than "have a bunch of people get together and spontaneously connect each other with facts/ideas/people/projects based on mostly random interactions". This current way seems to work quite well, but it is also pretty non-systematic, luck-based, and doesn't scale that well (it kind of does, but only in the "more people participate and invest time -> more people benefit" kind of way).

(That all being said, conferences obviously have a lot of other benefits than this recommender system aspect; so I'm not really asking whether there are ways to improve conferences, but rather whether there are different/separate approaches to connecting people with the information most relevant to them)

What about jobs in the field of education? I feel like there is a lack of discussion on teaching/working at public schools and its impact... I work at an elementary school, and oftentimes I feel (and understand to some extent) that these types of jobs appear rather “unattractive” to the public. Even I ask myself: is my role really necessary? My role might soon be completely redundant, especially in our technological age where all the information is available at our fingertips. And it seems like it's more and more going this way: remote teaching, home-schooling, smaller learning groups, etc. I am very interested in how the education system will evolve.
