Sunny1

66 karma · Joined Mar 2020

Comments (14)

In my view, one of the most defining features of the EA community is that it makes most people who come into contact with it feel excluded and "less than" on several dimensions. So it's not just you!

I think the centrality of the EA Forum to the overall "EA project" has likely caused a lot of unintended consequences like this. Participating in the Forum is seen as a pretty important "badge" of belonging in EA, but participating in an internet forum is generally not the type of activity that appeals to everyone, much less an internet forum where posts are expected to be lengthy and footnoted.

Great post. I appreciate the framing around the real gaps in human capital. One additional concern I have is that aesthetics might play a counterproductive role in community building. For example, if EA aesthetics are most welcoming to people who are argumentative, perhaps even disagreeable, then the skill set of "one-on-one social skills and emotional intelligence" could be selected out (relatively speaking). 

Just as a casual observation, I would much rather hire someone who had done a couple of years at McKinsey than someone coming straight out of undergrad with no work experience. So I'm not sure that diverting talented EAs from McKinsey (or similar) is necessarily best in the long run for expected impact. No EA organization can compete with the ability of McK to train up a new hire with a wide array of generally useful skills in a short amount of time. 

Well, no one has the "real" answers to any of these questions, even the most EA of all EAs. The important thing is to be asking good questions in the first place. I think it's both most truthful and most interpersonally effective to say something like "gee, I've never thought about that before. But here's a question I would ask to get started. What do you think?"

I really liked CEA's "tour of duty" framing for the recent hiring rounds!  I thought that was a great signal to potential candidates of what they could expect in the job. I think employers should be more explicit with candidates about what they're expecting from new hires in terms of tenure.

Conversely, I would encourage job applicants to be candid with employers about their tenure expectations for a role. For some roles, staying only 1-2 years is perfectly fine. For other roles, especially ones that require a lot of role-specific knowledge and training, that would be harmful to the organization. I would also ask candidates to introspect honestly: do they feel that a certain role is in any way "beneath them"? It can be damaging to morale to have a colleague who feels like the job is just a stepping stone to something else.

Strong +1 for squad goals :)

I just wanted to reinforce the point Benjamin made above about getting involved in the EA community. For example, if you apply for a job at an EA organization, they may request references from the EA community in addition to the standard references from your last job. Do you already have strong references from credible people in the EA community? If not, it would be worthwhile to do more networking. You may also need to build an EA track record of your own, for example by doing EA volunteer work, posting on the EA Forum, and so on.

Here's one way to think about this. Getting a job at an EA organization can be like getting a job in the film industry. You're trying to break into a "glamorous" industry. That is, some people consider these jobs "dream jobs" - they have an extremely compelling "X factor" that has nothing to do with how much the job pays. (In EA, the 'glamour' factor is the ability to have a really high-impact career, which is the central life goal of many EAs.) So you may need to network, volunteer for a while, etc. in order to break in. 

Congratulations!  I'm very excited about this project and I'm looking forward to following along. A couple of questions:

  1. Do you plan to do any thinking / writing about why EAs might choose to prioritize climate giving? Climate seems to occupy a weird space in EA. It's more "middle-termist" than short-termist or long-termist, so it doesn't neatly fit into either of those camps. And climate affects animals, humans living today, and future humans, so it doesn't align neatly with those camps either. It's a bit of an "all of the above" cause area. Any thoughts on this topic, or are you just planning to be a resource for those who already care about climate?
  2. What is your plan for influencing donors / moving money? For context, GiveWell moved about $150M in 2019, and it took them about 12 years to reach that milestone. What are your plans for translating your research into dollars moved?

Thanks and congrats again.

Thanks for this post. In my view, one of the most important elements of the EA approach is the expanding moral circle. The persistence of systemic racism in the US (where I live) is compelling evidence that we have a long way to go in expanding the moral circle. Writ large, the US moral circle hasn't even expanded enough to include people of different races within our own country. I think this inability to address systemic racism is an important problem that could have a negative effect on the trajectory of humanity. In the US, it's obvious that systemic racism impedes our ability to self-govern responsibly. Racial animus motivates people toward political actions that are harmful and irrational. (For a recent example, look at the decision to further restrict H-1B visas.) Racism (which may manifest as xenophobia) also impedes our ability to coordinate internationally - which is pretty important for responding to existential risks. So I tend to think that addressing systemic racism is probably undervalued by the average EA.

Editing to add another dimension in which I think systemic racism is probably undervalued. If you believe that positive qualities like intelligence are distributed evenly across different racial groups (and genders, etc.), then you would expect roles of leadership and influence to be distributed roughly proportionally across those groups. To me, the fact that our leadership, etc. is unrepresentative is an indicator that we are not using our human capital most effectively. We are losing out on ideas, talents, productivity, etc. Efforts to increase representation (sometimes called diversity efforts) can help us make more efficient use of our human capital.

I also just think racial equality is a good in itself - but I'm not a strict consequentialist / utilitarian as many in EA are.
