Darius M 🔸

Chief of Staff, Global Catastrophic Risks @ Coefficient Giving
1883 karma · Joined · Working (6-15 years) · Washington, DC, USA


I joined Coefficient Giving a few weeks ago as Chief of Staff on the GCR executive team, and I've updated strongly upward on how much counterfactual impact each new hire here can have! The GCR team is stretched really thin relative to the problems we're aiming to solve, and some of the opportunities we can't currently pursue feel really painful to leave on the table. I'd really encourage people to apply: in my view, these are some of the most exciting roles in the ecosystem right now, and CG is also just a very fun place to work, with deeply caring, smart colleagues and a great internal culture.

One signal-boost: my own team is hiring a senior generalist as part of this round. Most of the round is grantmaker roles, but if you have a strong generalist skill set, please apply to our team!

I currently believe the most impactful marginal funding opportunities focus on improving the welfare of highly numerous but neglected classes of animals (especially wild animals, shrimp, and invertebrates). As a longtermist, my work focuses on existential risk reduction, but my sense is that the key existential risk-related causes (e.g. AI safety, biosecurity) are relatively well-funded compared to the highest priority animal welfare causes, so I choose to donate there.


My cause prioritization would be much worse, and I would be much less morally ambitious. I also received tremendous support in my career from other people in the EA community, which had an extremely large impact on my career opportunities and my ability to do good.

As a caveat, there are some nuances to Wikipedia editing to make sure you're following community standards, which I've tried to lay out in my post. In particular, before investing a lot of time writing a new article, you should check whether someone else has tried that before and/or whether the same content is already covered elsewhere. For example, there have been previous unsuccessful efforts to create an 'Existential risk' Wikipedia article. Those attempts failed in part because the relevant content is already covered in the 'Global catastrophic risks' article.

One other relevant resource I'd recommend is Will and Toby's joint keynote speech at the 2016 EA Global conference in San Francisco. It discusses some of the history of EA (focusing on the Oxford community in particular) and some historical precursors: https://youtu.be/VH2LhSod1M4

I enjoyed reading this and would love to see more upbeat and celebratory posts like this. The EA community is very self-critical (which is good!) but we shouldn't lose sight of all the awesome things community members accomplish.

I recently had to make an important and urgent career decision and found it tremendously valuable to speak with several dozen wonderful people about this at EA Global SF. I'm immensely grateful both to the people giving me advice and to CEA for organizing my favorite EA Global yet.

Going very broad, I'd recommend going through the EA Forum Topics Wiki and considering the concepts included there. Similarly, you could browse the posts that make up the EA Handbook for suitable concepts.
