I really liked CEA's "tour of duty" framing for the recent hiring rounds! I thought that was a great signal to potential candidates of what they could expect in the job. I think employers should be more explicit with candidates about what they're expecting from new hires in terms of tenure.
Conversely, I would encourage job applicants to be candid with employers about their tenure expectations for a role. For some roles, staying only 1-2 years is perfectly fine. For other roles, especially ones that require a lot of role-specific knowledge and training, that short a tenure would be harmful to the organization. I would also ask candidates to introspect honestly - do they feel that a certain role is in any way "beneath them?" It can be damaging to morale to have a colleague who treats the job as just a stepping stone to something else.
Strong +1 for squad goals :)
I just wanted to reinforce the point Benjamin made above about getting involved in the EA community. For example, if you apply for a job at an EA organization, they may request references from the EA community in addition to the standard references from your last job. Do you already have strong references from credible people in the EA community? If not, it would be worthwhile to do more networking. You may also need to build up your own EA track record - volunteer work, posting on the EA Forum, and so on.
Here's one way to think about this. Getting a job at an EA organization can be like getting a job in the film industry. You're trying to break into a "glamorous" industry. That is, some people consider these jobs "dream jobs" - they have an extremely compelling "X factor" that has nothing to do with how much the job pays. (In EA, the 'glamour' factor is the ability to have a really high-impact career, which is the central life goal of many EAs.) So you may need to network, volunteer for a while, etc. in order to break in.
Congratulations! I'm very excited about this project and I'm looking forward to following along. A couple of questions:
Thanks and congrats again.
Thanks for this post. In my view, one of the most important elements of the EA approach is the expanding moral circle. The persistence of systemic racism in the US (where I live) is compelling evidence that we have a long way to go in expanding the moral circle. Writ large, the US moral circle hasn't even expanded enough to include people of different races within our own country. We permit systemic racism to continue because we can't muster the political will to end it. I actually think this inability to address systemic racism is a critically important problem that could have a negative effect on the trajectory of humanity. In the US, it's obvious that systemic racism impedes our ability to self-govern responsibly. Racial animus motivates people toward political actions that are harmful and irrational. (For a recent example, look at the decision to further restrict H-1B visas.) Racism (which may manifest as xenophobia) also impedes our ability to coordinate internationally - which is pretty important for responding to existential risks! So I tend to think that the value of eliminating systemic racism is probably substantially undervalued by the average EA.
Editing to add another dimension in which I think systemic racism is probably undervalued. If you believe that positive qualities like intelligence are evenly distributed across different racial groups (and genders, etc.), then you would expect roles of leadership and influence to be distributed in a representative fashion across those groups - barring systemic prejudice. To me, the fact that our leadership, etc. is not representative is a strong indicator that we are not using our human capital most effectively. We are losing out on ideas, talents, productivity, etc. So efforts to increase accurate representation (sometimes called diversity efforts) aren't just for the benefit of underrepresented people. We all benefit from a more efficient use of human capital. So this is another way in which ending systemic racism could have substantially positive flow-through effects.
I also just think racial equality is a good in itself - but I'm not a strict consequentialist / utilitarian as many in EA are.
Agreed. And it would be great to have a similar top-level post for the "new" GWWC once it launches, describing what is in and out of scope. In particular, it would be helpful to know whether GWWC is intended to be 1) an EA recruitment pipeline; 2) an end in itself, i.e., driving impact through donations; or both. It seems that charitable giving has fallen out of favor relative to changing careers as an impact lever since I pledged in 2015. I'm curious to know whether the leaders of CEA / GWWC see its mission primarily as driving charitable giving or as recruiting new EAs.
Thanks for the question as it caused me to reflect. I think it is bad to intentionally misrepresent your views in order to appeal to a broader audience, with the express intention of changing their views once you have them listening to you and/or involved in your group. I don't think this tactic necessarily becomes less bad based on the degree of misrepresentation involved. I would call this deceptive recruiting. It's manipulative and violates trust. To be clear, I am not accusing anyone of actually doing this, but the idea seems to come up often when "outsiders" (for lack of a better term) are discussed.
I also think, at least in the past, the attitude towards climate work has been vaguely dismissive.
As somewhat of an outsider, this has always been my impression. For example, I expect that if I choose to work in climate, some EAs will infer that I have inferior critical thinking ability.
There's something about the "gateway to EA" argument that is a bit off-putting. It sounds like "those folks don't yet understand that only x-risks are important, but eventually we can show them the error of their ways." I understand that this viewpoint makes sense if you are convinced that your own views are correct, but it strikes me as a bit patronizing. I'm not trying to pick on you in particular, but I see this viewpoint advanced fairly frequently so I wanted to comment on it.
Thanks for posting! I took an En-ROADS workshop with a trained facilitator in my local community and thought it was extremely well done. The organization that built En-ROADS trains facilitators who then teach others about the tool (and about climate).
En-ROADS itself is an example of an intervention whose impact would be difficult to quantify. The goal is to educate as many people as possible about the fundamental dynamics of the climate problem, using well-designed interactive workshops/tools that are based on robust evidence. It seems like a good approach to me, but I don't know if they can ever prove a positive impact on the climate problem. I sometimes wonder if a similar approach would be helpful for spreading the "EA gospel" to a wider audience.