TL;DR: The EA movement has yet to fully incorporate ideas, skills, people, and knowledge from existing fields (e.g. monitoring and evaluation). You could potentially have a lot of impact just by stealing these and adapting them to our contexts.
Epistemic status: Quick informal post on something I've been thinking about.
I’m currently basing a large portion of my career (and more importantly, my impact) on stealing ideas, best practices, and people from existing sectors and applying them in new contexts:
- Within both The Mission Motor and my previous work, we’ve been taking monitoring and evaluation (M&E) standards from global development organisations and applying them to EA and animal organisations.
- I was able to step away from Fish Welfare Initiative, in part, because we hired someone with years of experience running research and programs in global development to take over my role as research lead.
- I am founding a charity that takes strategies used by other agricultural development movements (e.g. the organic movement's use of farmer co-operatives) and tests them for farmed animal welfare.[1]
Effective altruism is still relatively nascent. We also tend to attract people (like myself) who are young and don’t have much experience with best practices from other spaces. As a result, many of our projects run with a deficit of existing knowledge. This is a problem because:[2]
- We risk wasting resources answering questions that have already been answered elsewhere.
- We risk doing things worse than we otherwise could, given existing expertise.[3]
- We risk locking in our inferior ways of doing things as the norm.
So we should steal more.
This can look like hiring people with existing skillsets. But it can also look like current EAs upskilling. In my experience, even a relatively novice understanding of a relevant field or skill can quickly outpace current norms. I’ll put my advice for how to upskill in this way in a comment below.
I could imagine the main downsides here being forms of credentialism or a stifling of innovation. These are worth watching, but I suspect that we are currently (at least in my EA circles) under-indexing on existing knowledge. As a result, a moderate swing in that direction would likely be net-beneficial.
We might also worry that encouraging EAs to upskill in order to resolve these gaps invites a bastardised version of external expertise. I do think this has happened to some extent with M&E, where it has been conflated with existing EA concepts like cost-effectiveness projections. That said, I generally think that even half-informed learning is better than none. In my experience, EAs who push into these new areas often end up creating more space for existing professionals as well.
Some areas where I suspect there’s a lot more to be learned for EA and animal spaces:
- Financial modelling
- Science communication
- Project management
- Ethnography
- User experience
But I’m sure there’s much more. I’d be interested to see other people’s ideas in the comments.[4]
[1] See my previous post for the rough idea. Let me know if you’d be interested in chatting about this!
[2] Also see this post: EA needs outsiders with a greater diversity of skills
[3] Example: EA on nuclear war and expertise
[4] I suspect that a good place to start looking is anywhere that freelancers are contracting a specific skill.

My advice for EAs who want to skill up in a neglected area:
In general, when you’re learning a new skill, Andrea Gunn’s talk on training leaders offers a lot of good insight. I also made a one-page summary of her talk.
I have historically been able to do this upskilling as a side project to my existing job.