
I’ve read a lot about “talent gaps” and how EA needs more top researchers, operations professionals, and policy analysts. I don’t doubt those gaps are real. You can contribute to the EA community by developing those skills.

However, you can also contribute to the EA community by developing rarer skills. There are many particular skillsets which would be very useful for a few EAs to develop. Although these skills may never turn into full-time jobs, I'd like to see people develop them because I think they could be quite useful.

Some suggestions of skills I'd like a few people to develop:

· Connecting EA and another group. I'd like to see a small number of people connecting EA to each of these:

-Religions (eg Methodists, Bahai)

-Social movements (eg Fair Trade, Black Lives Matter, drug reform/prison reform movements)

-Underrepresented cultural or demographic groups (eg people who haven't gone to university, pensioners, Indonesians)

· Apply a methodology other than RCTs and cost-benefit analysis to an EA problem. Can you measure large-scale education interventions using difference-in-differences? Are you a wizard at process tracing? We could really use your skills evaluating charity and policy interventions!

· Communicate important EA concepts. I'd like to see, for example, an illustrated guide to EA, or an infographic on charity evaluation. I'd like to see high-quality videos created promoting each of GiveWell's top charities. I'd like to see high-quality education within EA, and high-quality PR to the outside world. We only need a couple of people working on each of these at this point, but they could all be quite useful.

Helpful fields: adult education; online education; marketing; writing; lobbying; communications; PR; art; graphic design; video production.

· Global poverty that isn’t health. I'd like to see a handful of people in EA with expertise in, for example, climate policy, or education charities, or energy poverty in a developing world context.

· Research and management. If you are an 80th percentile researcher and 80th percentile manager, I salute you. You should definitely stick around.

· Corporate Social Responsibility. Working in corporate social responsibility in a large corporation could probably have quite an impact? I really don’t know, but I’ve never heard anyone in EA talk about it, so if someone investigated it that would be useful.

· Psychiatry. EA is full of mental health problems. You could help us fix them. Sure, another psychiatrist could help us, but you already understand our weird lingo and worldview. If you could sort out our mental health, we’d be very thankful.

This is just meant to be the start to a long list - what other skills would you like a small number of people in EA to develop?


Book up on tax law to offer targeted tax advice -- It’s plausible that many EAs often have financial situations more complicated than “I make $X/year in salary and I want to donate some % of it” while having much less money than Good Ventures (so having lawyers on retainer is not a live possibility).

  • Evidence for: Even somebody whose only job is at BigTech usually has relevant compensation in at least three different buckets (salary/bonuses, stock, 401K matching). I can imagine situations where a one-hour consultation (or reading a 20-minute blog post) is at least as helpful as a 2-3 hour consultation with a tax attorney who does not have the relevant EA context, as well as possibly catching things that some conventional tax attorneys would just miss.
  • Other evidence: complications from startup exits, cryptocurrency, consulting work

I can imagine that, as the movement grows, we could support ~2-5 people specializing in US tax law as a community, and maybe part-time specialists in the tax law of other countries (probably not a bad general option for earning-to-give if it turns out your time is only needed for part of the year).

Cognitive Enhancement through genetic engineering - Plausibly very important in general, but I think 1-5 people is a good start. When South Bay EA did a meetup about this, I think we broadly concluded (note: we did not formally poll; this is just my read of the room) that both of the following statements have a >50% chance of being true:

  • In an ideal world, it’s better to have human cognitive enhancement before AGI
  • If cognitive enhancement has to happen at all, it’s quite important that it’s done well.

I think it’s plausible (~30% credence, did not think about this too deeply) that human cognitive enhancement has comparable importance to bio x-risk, and I basically never hear about people going into it for EA reasons, possibly because of the social/political biases of being a mostly Western, center-left movement.

Farmed Animals Genetic Engineering - For a movement that prides itself on jokes about hedonium and rats on heroin, I don't think I know anybody who works on genetically engineering animals to suffer less. This only matters under the conjunction of: a) near-term AGI doesn't happen, b) farmed animal suffering matters a lot (in both a moral and an epistemic sense), c) clean/plant-based meat will not have high adoption within a generation, d) it's technically possible to engineer animals to suffer less in a cost-effective manner, and e) there is enough leeway in the current system to let you do so. Even with all that in mind, I still think a nonzero number of AR/AW people should investigate this. For d) in particular, I will personally be very surprised if you can't engineer chickens to suffer 1% less given approximately the same external conditions, and will not be too surprised if you can reduce chicken suffering by 50%.

I think there are obvious biases for why animal rights activists go into clean meat rather than engineering animals to feel less pain, so the fact that this path probably does not currently exist should not be surprising.

Micro-optimizations for user happiness within large tech companies. A large portion of your screen time is spent in interactions crafted by a very small number of companies (FB, Google, Apple, Netflix etc). Related to the idea above of targeting animal happiness directly: why aren't people trying harder to target human happiness directly? A fair number of EAs seem interested in mental health, but all of them are trying to cure *major problems*, rather than considering that a .002 sd change in the happiness of a billion people is a ridiculously large prize.
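To make the size of that prize concrete, here is a back-of-the-envelope sketch (my own illustrative framing, not from the answer above). It assumes happiness is measured on a scale with a known population standard deviation and that tiny per-user effects aggregate linearly across users:

```python
# Back-of-the-envelope: aggregate size of a tiny per-user happiness shift,
# expressed in "sd-person" units (one sd-person = moving one person by one
# full standard deviation). All numbers are illustrative assumptions.
def aggregate_shift_sd_persons(per_user_shift_sd: float, n_users: int) -> float:
    """Total effect of a uniform per-user shift, in sd-person units."""
    return per_user_shift_sd * n_users

total = aggregate_shift_sd_persons(0.002, 1_000_000_000)
print(f"{total:,.0f} sd-persons")
```

Under these assumptions, a .002 sd shift across a billion users is equivalent to moving roughly two million people by a full standard deviation, which is why the prize looks so large despite the per-user effect being imperceptible.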

I know exactly one person (working very part-time) on this. I think there's a decent chance that a single person who knows how to Get Things Done within a large company could convince execs to let them lead a team to investigate this, and also a decent chance that it's doable without substantial technological or cultural changes. These large tech companies already spend hundreds of millions of dollars (if not more) on other ethics initiatives like diversity, fairness, transparency, user privacy, and suicide prevention. So it's not at all crazy to me that somebody could manage upwards by crafting a convincing enough pitch* to launch something like this in at least one tech company.

Involvement in various militaries - Pretty speculative. I’ve talked to former (American) military members who think it’s not very impactful, but I still think that prima facie it’d be nice if we had very EA-sympathetic people within earshot of people high in the chains of command in, say, militaries of permanent UN Security Council Members, or technologically advanced places like the IDF.

Content creation/social media marketing. I have some volunteering experience in this, enough to know that this is a non-trivially difficult skill with large quality differences between people who are really good at it and people who are average. EA does not currently want to be a mass movement (and probably never will), but assuming that this changes in the next 5-10 years (~15-20%?), I think having 1-5 people who are good at this skill would be valuable, and I'd rather not have to buy our branding on the open market.

*Hypothetical example pitch: "we always say that we respect our users and want them to be happy. But as a data-driven firm, we can't just say this and not follow up with measurable results. Here are some suggested relevant metrics of user happiness (citations 1,2,3), and here's the pilot project that increased user happiness in this demographic by .0x standard deviations."

Related news for the suffering engineering idea (but sadly also related for the cognition engineering one).

Very cool!

Trades - plumbing, electrics, carpentry, general building and maintenance. I think there is room for at least 1 or 2 EAs with these kinds of skills. Maybe it's peculiar to our situation in the North-West of England, but we've found it very difficult to hire reliable and competent tradespeople to do jobs that need doing for the EA Hotel. The good people are always busy, and the rest are very unreliable (it's often a surprise if they turn up at all). Having someone aligned with our mission would greatly help with reliability and quality of outcome I imagine - this is an area where knowing the right person is a key factor. With the proliferation of other EA hubs (who may face similar challenges in terms of hiring quality tradespeople) there could be opportunities to travel, help build infrastructure, and get more involved in the community. Also, expected earnings are reasonably high, so a decent level of earning-to-give would be possible too.

This remains as true now as it was then. We are still struggling to find reliable tradespeople. The norm seems to be for tradespeople to take on many more jobs than they can do, make unrealistic promises (likely to many different people simultaneously), and just not turn up (more than) half the time. Many of our maintenance issues at CEEALAR (the EA Hotel) have been extensively prolonged because of this. To add to this, we've had experiences of outright dishonesty and of being left out of pocket, paying for things that were never delivered. Having community members with a shared common mission (EA) in these roles would help a lot with reliability.

Someone needs to start a website that is like Check-a-Trade, but where the tradespeople pay a deposit that they only get back if they show up, and payment for jobs is (part-)dependent on them being completed on time.
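The settlement rule that idea implies can be sketched in a few lines. This is a minimal, hypothetical illustration of the mechanism as described (deposit refunded only on showing up, part of the fee tied to on-time completion); the function name and the 30% on-time share are my own assumptions, not part of the proposal:

```python
# Hypothetical payout rule for the suggested Check-a-Trade-style site.
def settle_job(deposit: float, fee: float, showed_up: bool,
               completed_on_time: bool, on_time_share: float = 0.3) -> float:
    """Total paid out to the tradesperson when the job closes."""
    if not showed_up:
        return 0.0                               # deposit forfeited, no fee earned
    base = fee * (1 - on_time_share)             # portion paid on completion
    bonus = fee * on_time_share if completed_on_time else 0.0
    return deposit + base + bonus                # deposit refunded
```

A reliable tradesperson recovers the deposit plus the full fee; a late one loses the on-time share; a no-show loses everything, which is the incentive the comment is after.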

Would be interested to hear why people think this is a bad idea.

[Not one of the downvoters]

The leading rationale of "Learn a trade --> use it for EA projects that need it" looks weak to me:

  • There's not a large enough density of 'EA' work in any given place to take up more than a small fraction of a tradesperson's activity. So this upside should be discounted by the (substantial) time to learn the trade, and then most of one's 'full time job' as (say) an electrician will not be spent on EA work.
  • It looks pretty unlikely that we'd have 'nomadic' tradespeople travelling between EA hubs, as the added cost of flights etc. suggests it might be more efficient just to try to secure good tradespeople by (e.g.) offering above-market rates.

As you say, it could be a good option for some due to good earning power (especially for those with less academic backgrounds, cf. kbog's guide), but the leading rationale doesn't seem a substantial reason to slant recommendations (e.g. if you could earn X as a plumber but 1.1X in something else, the fact that you could occasionally help out on EA projects shouldn't outweigh this).

Good points. With the growth of hubs it could become more viable even if it isn't now. Transport costs (time, money) are probably low enough to make it efficient to travel at least a few times a year. Offering/accepting above-market rates might help a bit, but it would still incur search and vetting costs. Given training costs and counterfactuals, another option might be to find good tradespeople and get them on board with the EA mission. (For the curious: kbog's guide.)


Forecasting - This is more a 'skill I'd like to see more of in the EA community' than a career track. It seems a generally valuable skill set for a lot of EA work, and having some people develop expertise/very high performance in it (e.g. becoming a superforecaster) looks beneficial to me.


To those interested in becoming better forecasters: I strongly recommend the list of prediction resources that Metaculus has put together.

Do you see this as a niche skill for a few people to highly develop (like software development), or a skill all EAs should ideally develop (like statistics)?

Gregory Lewis
A bit of both: I'd like to see more forecasting skill/literacy 'in the water' of the EA community, in the same way statistical literacy is commonplace. A lot of EA is about making the world go better, so a lot of (implicit) forecasting is done when deciding what to do. I'd generally recommend most people consider things like opening a Metaculus account, reading Superforecasting, etc. This doesn't mean everyone should be spending (e.g.) 3 hours a day on this, given the usual story about opportunity costs. But I think (per the question topic) there's also a benefit to a few people highly developing this skill (again, a bit like stats: it's generally harder to design and conduct a statistical analysis than to critique one already done, but you'd want some folks in EA who can do the former).

Ozzie Gooen has got funding "to build an online community of EA forecasters, researchers, and data scientists to predict variables of interest to the EA community". Excited to see the outcome of this.

Charity Law - Perhaps one EA in each of the 5 countries with the highest numbers of EAs. It would be useful to have people familiar with EA who can offer their professional services to help incorporate new projects as charities (even better if they can offer reduced rates or pro bono advice). This is another thing we've struggled with at the EA Hotel. Our situation is complicated by the fact that it doesn't neatly fit the UK Charity Commission's charitable purposes. We've been recommended a few law firms, but haven't yet been able to hire anyone or get any pro bono advice (not least because we aren't already a charity!)


Strong upvote for the topic and examples. This seems like a clear area for progress within EA, so long as we continue to evaluate different interventions using an impartial model. (More education experts within EA gives us a better chance of noticing opportunities there, but could also lead to education interventions getting promoted despite lower impact depending on the incentives and priorities of the experts.)

If our goal is to improve the world as much as possible (and it is), the world is a big place with a lot of different groups. There are a lot of levers we could be trying to pull, and while we have a pretty good idea of which levers tend to be more or less important, the world is changing all the time (and even now, if we did have perfect information, our lists of "top causes" would probably look quite a bit different). I'd love to see more people becoming experts on a particular "lever", just in case.

The idea of having more connections to groups of people (vs. more access to causes, as I discussed above) is even more promising, though it's important to build EA communities // EA presence within a community slowly and carefully, with respect toward community norms and ideas. I've occasionally seen this go badly (e.g. individuals and groups who market EA too aggressively to a new audience, accidentally burning bridges in the process).

Yeah, ideally I'd like people to invest significant time into genuinely being part of two communities, rather than just "marketing EA." A good example of this would be EA Quakers. I've met a few who don't explicitly talk about Quaker values at EA events or EA values at Quaker events, but who have a deep understanding of both perspectives.

I'd like to see a small number of people connecting EA to each of these... Social movements (eg Fair Trade, Black Lives Matter, drug reform/prison reform movements)

If you still hold this view, I'd be interested in why you'd like to see people "connecting EA to each of these." What are the benefits you expect? Is it for recruitment (i.e. getting them to switch causes), shared strategic knowledge, or something else? (If you don't still hold this view, I'd be interested in why not?)

I ask because I'm just finishing up a social movement case study on the Fair Trade movement; hence why I stumbled across this post from a year and a half ago.

I do! I think "shared strategic knowledge" is probably the best way of phrasing it, so thank you for that.

It's also good to have a wide network of people with similar goals who you can collaborate with, and not all of the people I'd like to collaborate with are still "Effective Altruists". It's nice to have someone who's a trusted member of both EA and another relevant movement, who can provide introductions and advice.

Re: CSR. George Howlett started Effective Workplace Activism a couple of years ago, but it didn't take off that much. Their handbook is useful.

I tried quite hard to change my large corporation's charity selection process (maybe 50 hours' work), but found the stubborn localism and fuzzies-orientation impossible to budge (for someone of my persuasiveness and seniority).

I'd imagine that "ease of changing selection process" varies considerably by company, so I'll also share my anecdote (though I don't doubt that it's often quite difficult):

The health software company I worked for in 2015-16 offered each employee something like $600 to donate to any charity in a long list they offered, almost all of which were connected somehow to the surrounding city/state or the topic of "healthcare". Using the latter category, I submitted several GiveWell/TLYCS charities for consideration, and one of those was added to the list.

This didn't make a huge difference in employees' overall allocation (the list was on the order of ~100 charities long), but because I'd been running a small EA group at the company, I was able to inform the members and some of my interested coworkers about the new option, so I'd guess we redirected $10,000-20,000 overall.

I think I have actually heard a couple of people talk about trying to influence their workplace fundraisers etc. - I guess I was thinking more about careers in CSR.

80K actually has a career profile for foundation grantmakers, which could be related to a career in CSR? I imagine in CSR you would come across the barriers of localism/feel-good charities mentioned by technicalities, but the article is a good starting place.

-Social movements (eg Fair Trade, Black Lives Matter, drug reform/prison reform movements)

I have been part of a few. Those perspectives are really useful.

· Global poverty that isn’t health. I'd like to see a handful of people in EA with expertise in, for example, climate policy, or education charities, or energy poverty in a developing world context.

Education and Human Development Indicators are something that EA needs to pick up.

No takers so far, as can be seen from the votes on my comments.

@Lucy I agree with you and also feel this is a hole. I'd love to discuss further. Add me on Linkedin here: https://www.linkedin.com/in/simongraffy/

Between RTI's Tusome in Kenya, Teaching At the Right Level, Phonic, LEAP in Liberia, Pratham, Mindspark, and the evidence on instructional coaching, there are emergent interventions that could be funded. They are system-dependent and complex, but important, and more neglected than most of the EA stuff I've found so far.
