Through overpopulation and excessive consumption, humanity is depleting its natural resources, polluting its habitat, and causing the extinction of other species. Continuing like this will lead to the collapse of civilisation and likely our own extinction.
This one seems very common to me, and sadly people often feel fatalistic about it.
Two things that feeling might come from:
It's really cool to see these laid out next to one another like this! Thanks for posting Katja :)
Makes sense! FWIW, I really enjoyed reading your post. There’s definitely something nice about how listing specific vacancies forces us to get really concrete about what all this theorising actually means, even though doing so has been a bit challenging sometimes!
Thanks for the post Henry! I work at 80,000 Hours and have thought a little bit (along with Maria) about some of the indirect effects of the job board recently - especially about the degree to which it’ll be seen as representing our all-considered views of the best jobs. So it’s good to have some discussion of it!
Like you, I’m really excited about people using the job board to expand their ideas of what EA/longtermist roles can look like, especially to types of roles which don’t have (something like) “effective altruism” somewhere in the name. Rob wrote a bit more about this here.
That being said, I do share many of Habryka, Aidan and Ben’s concerns about people thinking of it as representative of good opportunities in EA. It’s missing roles which orgs don’t advertise, lots of opportunities at early-stage orgs, and roles you design yourself, and it doesn’t foreground graduate school enough (yet!).
You can read more in the user guide/FAQ about how we hope for people to think about the roles we list. In particular, I’m keen for people to keep this in mind:
“there is a good chance that your best option is actually a role that is not featured on the board. If you find a role that seems promising but is not listed on our board, you should not infer that it is less promising than the roles that we do feature.”
I’m Brenton from 80,000 Hours - thanks for writing this up! It seems really important that people don’t think of us as “tell[ing] them how to have an impactful career”. It sounds absolutely right to me that having a high impact career requires “a lot of independent thought and planning” - career advice can’t be universally applied.
I did have a few thoughts, which you could consider incorporating if you end up making a top level post. The most substantive two are:
Many of the priority paths are broader than you might be thinking:
Most people won’t be able to step into an especially high impact role directly out of undergrad, so unsurprisingly, many of the priority paths require people to build up career capital before they can get into high impact positions. We’d think of people who are building up career capital focused on (say) AI policy as being ‘on a priority path’. We also think of people who aren’t in the most competitive positions as being within the path.
For instance, let’s consider AI policy. We think that path includes graduate school, all the options outlined in our writeup on US AI policy, and the 161 roles currently on the job board under the relevant filter. It’s also worth remembering that the job board has still left most of the relevant roles out: none of them are congressional staffer positions, for example, which we’d also think of as falling under this priority path.
A significant amount of our advice is designed to help people think through how to approach their careers, and will be useful regardless of whether they’re aiming for a priority path.
In our primary articles on how to plan your career, we spend a lot of time talking about general career strategy and ways to generate options. The articles encourage people to go through a process which should generate high impact options, of which only some will be in the priority paths.
Unfortunately, there’s something in the concreteness of a list of top options which draws people in particularly strongly. This is a communication challenge that we’ve worked on a bit, but don’t think we have a great answer to yet. We discussed this in our ‘Advice on how to read our advice’. In the future we’ll add some more ‘niche’ paths, which may help somewhat.
A few more minor points:
Thanks for this post - I agree with your main point that there are many ways to contribute without working at organisations that explicitly identify with the effective altruism community, as would the rest of 80,000 Hours (where I work). In fact, I might go further in emphasising this.
The overwhelming majority of high impact roles in the world lie outside those organisations – with governments, foundations, intergovernmental agencies, large companies and, as you point out, academia. The majority of people interested in effective altruism should be taking roles in places like these, not EA orgs. Unfortunately, when we highlight specific roles there’s a bias towards opportunities we know about due to our involvement in the community, but where we’ve managed to correct for that (such as in the AI strategy and governance problem area of our job board) it’s clear that there are lots of valuable roles focusing on our top problems at a wide range of organisations.
I agree that when considering their future career path, people should think about what skills and expertise they already have (link, link). That might mean - if you’re enjoying and succeeding in your current path - staying there and using that position to influence your field / company in a positive direction. Though it might also mean thinking about how your skills might translate to other effective careers. For example, governments tend to be keen to hire people with science PhDs or tech skills, as shown by things like the AAAS fellowship and Tech Congress in the US. These don’t tend to feel like a natural step from a PhD, but being a scientific adviser in government seems plausibly pretty high leverage.
Since you mentioned academia, I thought readers might be interested in a few resources that might be useful for them if they’re looking to influence their academic field. There’s a Facebook group for EA academics to share what they’re working on and help each other. Luke Muehlhauser wrote an excellent report on cases where people successfully and unsuccessfully tried to deliberately build new fields. One case study that's particularly interesting is that of neoliberal economics (written up compellingly by Kerry Vaughan), which is often held up as a great example of what can be achieved through careful work both within academia and with the people who disseminate ideas – journalists, authors, think tanks etc. Finally, there’s our career review.
A few nice examples I've seen along these lines:
ACE's graphs on how relatively neglected farm animal welfare is.
Wait But Why on putting time in perspective.
A bunch of art on space, of which this clip of the Virgo Supercluster is an example.
And my favourite - 'If the Moon Were Only 1 Pixel - a tediously accurate scale model of the solar system'.
Thanks Catherine. I’m going to quote the relevant part of my conclusion here, as I think the overall results of high school outreach are one of the most remarkable things to have come out of this review, but they haven’t seen any discussion here so far.
I’ve been very surprised at how little measured success high school EA outreach efforts have yielded. This post has compiled evidence from many competent people trying out multiple different methods, which in total have had over 5 years of full time equivalent work go into them. This has resulted in:
Three students becoming counterfactually interested in EA enough that they became involved in university groups or made a career change. I would guess that this work (mostly Catherine’s) accounts for the majority (>75%) of the measured success.
10-20 students becoming counterfactually interested in EA enough to reduce their meat consumption or start fundraisers for ACE or GiveWell recommended charities.
<$20,000 USD raised for ACE or GiveWell recommended charities.
"I don't think I'm assuming that."
That's fair - my bad. I think that it felt worthwhile making this point because an obvious response to your conclusion that "demand for jobs at professional EA organizations will continue to be very high" is to not worry if demand for these jobs drops. Or one could go further, and think that it would be good if demand dropped, given that there are costs to being an unsuccessful applicant. I appreciate that you're agnostic on whether people should have that response, but I personally think it would be bad - in part due to the reasoning in my previous comment.
[I work at 80,000 Hours]
It seems like you’re assuming that it would be better if EA organisations could make their jobs less desirable, in order to put off applicants so that the jobs would be less competitive. That doesn’t seem right to me.
Making the jobs less desirable is likely to either put off applicants at random, or even disproportionately put off the most experienced applicants who are most picky about jobs. That would seem reasonable to do if EA orgs were getting plenty of applicants above the bar to hire, and didn’t think there would be much difference in job performance amongst them. But that doesn’t seem to be the situation these organisations are reporting. Given that, we'd expect that reducing applicants by making the jobs less desirable would harm the beneficiaries of the organisations.