AustinZ

Many of these concerns resonated with me.

As a relative outsider, I formed my understanding of EA from its online content, which emphasises utilitarianism and longtermism. When speaking to EAs in person, I'm often surprised that community members (and leaders?) hold these perspectives more weakly than I expected. I think there are messaging issues here. Part of the problem might be that longtermist causes are more interesting to write and talk about. We should be careful to allocate attention to cause areas in proportion to their significance.

Too much of the ecosystem feels dependent on a few grantmakers / re-granters, which concentrates power in relatively few people's hands. (At the same time, this seems to be a very hard problem to solve; no particular initiatives come to mind.)

I see EA's preoccupation with reputational risk and optics as a flaw stemming from its overly utilitarian perspective. Manipulating the narrative brings short-term reputational benefits but hidden long-term costs.

At the same time, I am sceptical of EA's ability to adequately address these issues. Such concerns have been raised before without leading to significant change. Many of these issues seem to have arisen from the centralisation of power and the over-weighting of community leaders' opinions, yet the community is simultaneously decentralised enough that coordinating such a change is difficult.