Each one of us has only a single perspective, and it’s human nature to assume other people share it. EA is a bubble and there are certainly bubbles within the bubble, e.g. I understand the Bay Area is very AI-focused while London is more pluralistic.
Articles like this that attempt to replace one person’s perspective with hard data are really useful. Thank you.
At EA for Christians, we often interact with people who are altruistic and focused on impact but do not want to associate with EA because of its perceived anti-religious ethos.
On the flip side, since I became involved with EA for Christians, a number of people have told me they are Christian but keep it quiet for fear it will damage their career prospects.
We should all try to maximise our impact and there’s a good argument for specialisation.
However, I’m concerned by a few things:
“Highly engaged EAs were much more likely to select research (25.0% vs 15.1%) and much less likely to select earning to give (5.7% vs 15.7%)”
Are you sure this isn’t just a function of how “highly engaged” is defined?
The Parable of the Good Samaritan seems to lean towards impartiality. Although the injured man was lying in front of the Samaritan (geographic proximity), the Samaritan was considered a foreigner / enemy (no proximity of relationship).
Did the EV US Board consider running an open recruitment process and inviting applications from people outside its immediate circle? If so, why did it decide against?
Thanks, Ben. This is a really thoughtful post.
I wondered if you had any update on the blurring between EA and longtermism. I’ve seen a lot of criticism of EA that is really just low-quality criticism of longtermism because the conclusions can be weird.
You raise some good points, so I have removed that point from the main article.