
Epistemic Status: I haven't run this past anyone, I'm just bashing this out on a plane. Take this as Ruby's tentative opinion and note that Ruby hasn't spent that much time considering EA career advice. 

In the past couple of months, I've attended an EA student retreat, EAGxOxford, and EAGxBoston (I'm on my way to EAG London now). At all of these, I have encountered a lot of students who are trying to figure out how to have an impact once they graduate.

Overwhelmingly these students are seeking immediate direct impact options; primarily in AI Safety, Biosecurity, or Community building. There are a decent number of roles available, but not as many as the number of people I'm meeting.

Something I am not hearing really at all, though it has been advocated before, is that people seek out regular industry jobs where they will grow and learn a lot. My name for this is Earning to Skill.

The Earning part doesn't actually matter anymore – it just allows for a more catchy name that riffs off Earning to Give. [I will tentatively say that Earning to Give is dead. If you'd be donating to longtermist/x-risk causes, I'd be surprised if you can find worthy donation targets the current major funders or their many regranters wouldn't already fund if they knew about them (so just tell them).]

[I retract that. The Earning part matters for building up your own financial reserves, which will give you future freedom to explore more options. It's great to not have to get a job or apply for a grant. Financial security is also good for psychological well-being.]

Skills definitely matter. There's a lot you don't get from university degrees that you will learn from spending a few years in an at least moderately functional workplace with good mentorship. And while I'm open to hiring people straight out of university if they seem promising enough, it's much easier to judge and be confident of someone's abilities after they've worked for a few years.

Personally, when I graduated in 2015 I correctly judged that I wasn't that useful to direct EA work. And I think that I further correctly judged that I would learn and grow faster in an industry job than EA work. Instead of trying to get a job at CFAR, I got a Data Science job. The Data Science job became a Product Management job. After a couple of years at that, I felt I wasn't growing at my company anymore, plus I felt "useful", so I quit and explored other options. Soon after I joined the LessWrong 2.0 team.

I still feel that an outsized proportion of the skills I'm using day-to-day now (I'm currently Team Lead for LessWrong), I gained in those two years of industry work.

Some things I learned:

  • How to manage my time across a large number of tasks of varying size, importance, and urgency
  • How to communicate information to different people
  • How to have working relationships with people
  • How to prioritize
  • How to collaborate with others on complex projects
  • What function and dysfunction look like within a company
  • How to be resilient/scrappy/perseverant
  • How to resolve conflict with coworkers
  • How to model my coworkers
  • How to negotiate both with people inside and outside the company
  • A bunch of technical Data Science skills
  • and probably a bunch of other things I haven't gotten names for

I benefitted heavily from the mentorship I had there (two different very good mentors), the likes of which I have not had since switching to direct work, and also exposure to a wide variety of skilled professionals with many years of experience.

Mentorship is great because your mentors can point out your blind spots to you, i.e. things you really need to work on but don't even realize how bad you are at. It's excellent if your mentor is strong at your blind spot/weakness and coaches you in it (I got a lot of this from one of my mentors on social relationships).

Even if you predict short timelines, I'd wager that for many people, 1-3 years spent in a good industry workplace environment will cause them to have greater lifetime contribution to the world than if they scrounged around for a direct impact job that wasn't that good.

This isn't a universal prescription, of course. The best thing for a person will depend immensely on them, their circumstances, and their opportunities, but I'd at least like people who are very uncertain of what to do to consider Earning to Skill.

And I'd like to hear people respecting this as actually a pretty good option for people to take up. If someone takes up a hard job where they'll learn, they'll get my respect at least.


Which industry jobs should I maybe seek out?

I think the particular domain matters less than that the job is hard and the company is competent on at least some dimensions. By hard I mean "solving hard problems like running a startup" rather than "you have to wake up at 5am". I think doing hard things is healthy for growth. Take on a bit more responsibility than feels comfortable (not far too much, but enough that it's uncomfortable – better to not feel fully qualified).

You also want to be learning from people and systems that are doing something right. If a company is selling a product or service and making money, it's probably doing something right. (And it's a great exercise to identify what they're doing wrong and help them fix it.) If a think tank is successfully influencing policy, it's probably doing something right. Note that I don't expect to find a company that's doing everything right.

It is important to determine which aspects of a company are healthy and which are dysfunctional. No company I'm aware of has been free of dysfunction, but you can still learn from them.

What if people experience value drift and don't ever switch to impactful work?

I think it's a risk worth taking, but also something that can be mitigated by staying socially tied to EA. Attend meetups (start one at your workplace?[1]), read the EA Forum and LessWrong (even better, write posts), attend EAGs, etc.

  1. Facebook/Meta, Google, and Bloomberg all have EA groups, I believe.





Comments

I will tentatively say that Earning to Give is dead

ETG is not dead, and you shouldn't declare it dead. Besides being false, as I argue below, it's also demotivating for the 15-20% of surveyed EAs pursuing this path.

ETG has a strong claim on being the most good you can do for the world.

  1. It is clearly highly valuable if you are not fully long-termist (perhaps because you do not have total population ethics).

  2. Even if you are longtermist, there is still a strong claim for the value of ETG.

It has been de-prioritized by 80k, but they have a specific argument for this that depends on a set of strong claims and a particular (longtermist) moral framework. Among other things...

To a significant extent, this follows from the fact that the most promising causes tend to be talent-constrained rather than funding-constrained

But there is a strong case that talent can be bought with funding, and more could be done here.

See recent posts:




Thank you for the detailed reply!

I agree that Earning to Give may make sense if you're neartermist or don't share the full moral framework. This is why my next sentence begins "if you'd be donating to longtermist/x-risk causes." I could have emphasized these caveats more.

I will say that if a path is not producing value, I very much want to demotivate people pursuing that path. They should do something else! One should only be motivated for things that deserve motivation.

I've looked at the posts you shared and I don't find them compelling. 

I think the best previous argument for Earning to Give is that you as a small donor might be able to fund things that the major funders won't or can't, but my current sense is that the bar is sufficiently low that it is now very hard to find such opportunities (within the x-risk/longtermist space and framework, at least). Things that seem like remotely a good idea get funding now.

I think that the reason we're not hiring more people isn't for lack of money, as discussed on that post.

There might be crazy future scenarios where EA suddenly needs a tremendous amount of money, more than all the funders currently have (or will have), in which case additional funds might be useful, but...it seems if we really thought this was the case, the big funders should raise the bar and not fund as many things as generously as they do.

I agree that Earning to Give may make sense if you're neartermist or don't share the full moral framework. This is why my next sentence begins "if you'd be donating to longtermist/x-risk causes." I could have emphasized these caveats more.

OK. I guess it would be better to have phrased it a little differently ... make it more like 'my belief is, and the consensus of people I've spoken with ... in the context of longtermist and x-risk causes'

I will say that if a path is not producing value, I very much want to demotivate people pursuing that path. They should do something else! One should only be motivated for things that deserve motivation.

I agree with this, which is why I also said something like 'and I think ETG actually has great value'

I've looked at the posts you shared and I don't find them compelling.

What about the LW post? That seems like the most compelling one to me that 'actually you probably could use more money to hire better people into AI research etc, it just isn't being done right'.

My basic skepticism is sort of a classical economics argument. Unless intrinsic motivation is both rare and extremely important...

if 'problem X needs more talent' you should be able to hire people to consider problem X, subsidize training people to build skills to address X, fund prizes for solutions to X, etc.

If the issue is 'the problems are not defined well enough', you also should be able to fund people to target these problems, maybe fund people to refocus their research on these problems.

My fear is that the 'ETG is not important' is coming from a sort of drop-in-the-ocean fallacy ("there's already $1 billion going into X, so my $10,000 can't make a difference")

I also think that some of the critiques about "we don't know what to do next in X-risk/S-risk that isn't being funded" probably also apply to direct work. If we don't know what to do/fund, then how do we know that an additional EA skilling/focusing on this stuff will have a major impact?

Earning to Give isn’t dead, even for longtermism. There’s lots of value in having small amounts of seed funding available to help get projects off the ground.

But there already is from the major funders.

It might be the combination of small funding and local knowledge about people's skills that is valuable. For example, funding a person that is (currently) not impressive to grantmakers but impressive if you know them and their career plans deeply.

I bet that if they are impressive to you (and your judgment is reasonable), you can convince grantmakers at present.

Downvoting because I'd like to see more justification for a claim like "Earning to Give is dead". I'm totally down for bold proclamations, even when they go against the norm, but because you don't really back it up here it feels sloppy and really detracts from an otherwise interesting and useful post, making me much less likely to want to send it to anyone. 

"Earning to Give is dead" would need more justification if this post had been written today as opposed to how much it needed when written over a year ago.

I honestly think it would need it at either time point. This is largely because, even if you crunched the numbers and convincingly showed that ETG a year ago wasn't impactful, I still think "ETG is dead" stakes a claim to what "doing the most good" is, one that would needlessly lead astray people in EA who wouldn't be open to changing their career path so drastically in response to changed circumstances.

This is also hindsight bias, but seeing as we're in the state we are now, a post that declares "ETG is dead" without considering the sustainability of funds over time would seem indicative of a general lack of the deep consideration I'd hope for with any claim as bold as that.

Don't want to be the pedantic "tie up every loose end and make sure to justify every little thing before you post" person, but also think this was a large enough part to the post generally that posting without that reasoning just doesn't seem like a great idea. 

That's a fair position -- but I thought the hindsight effect is strong enough that a responsive comment pointing out the context in which the original poster decided not to provide more justification should be made.

Yeah fair enough, understood in that spirit I think that makes a lot of sense

This is a topic I love talking about, and since you're a professional lesswronger, I'll try to comment in the spirit of making predictions about random people who will read my comment :)

Here goes:

Are you a software developer reading this?

My prior is that you already want to build skill - but you aren't so sure how, and this is your bottleneck.

If you're not yet working professionally: consider reading this.

If you are working professionally but think you could maybe grow faster: I probably agree :) Consider helping me out by reviewing some draft articles about how to do that. I offer video calls too if the articles aren't enough.

I agree with Ruby:

  1. Working on "hard" things (as he defined them) is great. I love the distinction between "waking up at 5am" and "running a startup"!
  2. Mentoring is great (but I predict you're having a hard time finding good mentoring)
  3. Working in a company that knows what it's doing is great (but I predict (with lower certainty) you're having a hard time figuring out if the company you're considering is actually great or if this is just something people are selling you)

Instead of "Earn to Skill", how about "Earn to Learn"?

I think Ben West coined it :)

How did my predictions go?

Let me know!

And hey Ruby :)

Earn to Learn

Well dang.

A couple of thoughts in response.

First: yes, I strongly agree that the world would be a better place if young EAs gained some maturity and general professional skills before trying to change the world.

Second: My guess is that the set of companies that are competent and at which young EAs would learn valuable skills overlaps heavily with the list of companies that are large/famous/well-paying.

Third: The rough impression I have is that it is very uncommon to be able to tell how good a working environment a particular role will be before you are actually working in it. Job descriptions, interviews, and websites often do not give an honest description of the pluses and minuses of a role.

Thus, if I were a recent college grad, I'm not sure what action I would take based on this post other than applying to large/famous/well-paying companies (which I would do anyway). Maybe I would slightly shift my priority away from "have impact now by applying to OpenPhil" and instead focus a bit more on "work for a Fortune 500 company to develop my skills for a few years, then shift into direct impact." Am I understanding that roughly correctly?

I think Ruby is implicitly saying to prefer successful-seeming startups to big companies.

"Earn to learn" is pretty great because it rhymes, but it doesn't convey the concept IMO. When I first read the "Earning to" part I assumed it was about earning to put aside money to buy time to study or take courses or something.

More descriptive: Something like "Working to Skillbuild" maybe?
