EA thinking is thinking on the margin. When EAs prioritise causes, they do so given that they only control their one career, or, sometimes, given that they have some influence over a community of a few thousand people and the distribution of some millions or billions of dollars.

Some critiques of EA act as if statements about cause prioritisation are absolute rather than relative: that EAs are saying literally everyone should be working on AI Safety, or, on the flipside, that EAs are saying no one should be working on [insert a problem which is pressing, but not among the most urgent to commit the next million dollars to].

In conversations that sound like this, I've often turned to the idea that if EAs controlled all the resources in the world, career advisors at the hypothetical world government's version of 80,000 Hours would be advising some people to be... postal workers. Since the EA world government would have long since filled the current areas of direct EA work, working in the comparatively neglected postal service could be the single most impactful thing a person could do with their skillset.

In this world some people would also be told that the best thing they could do is to work on [insert a problem which is pressing, but not among the most urgent to commit the next million dollars to in our current world]. 

It's basically just a fun thought experiment to make the point that EAs are not allocating the whole world's resources, and that if they were, they wouldn't (and shouldn't) argue for neglecting everything except the current top EA causes.
