Saw this post for the first time after it was linked from one of the recent FTX posts, and wanted to say thank you for having taken the time to write and express these concerns, which clearly weren't very popular but turned out to be... prescient. I'm a bit frustrated this didn't get more karma or engagement at the time.
I'm also frustrated that I probably just scrolled past without clicking or considering because it didn't have much karma and seemed 'against the mood.' It feels important for everyone (like me) who was caught off guard this week to recognize that this was not, actually, unforeseeable. It's humbling to realize how much work our cognitive biases must have been doing. Anyway, thanks!
Strongly upvoted, and think this is a great post with advice I hope people take seriously.
A minor critique of this part - "but in practice, they’re usually solving ops bottlenecks at the cost of building harder-to-acquire skills" -- I worry people will take this to mean that solving ops bottlenecks is something easily done by most young EAs without much career capital. This hasn't been my experience. I think solving ops bottlenecks effectively is really hard, and is in fact one of the things I wish more young EAs would build skills in doing by going into work outside of EA with lots of mentorship and feedback loops.
I definitely felt dumb when I first encountered EA. Certain kinds of intelligence are particularly overrepresented and valorized in EA (e.g. quantitative/rational/analytical intelligence), and those are the kinds I've always felt weakest in (e.g. I failed high school physics and stopped taking math classes as soon as I could). When I first started working in EA, I felt a lot of panic about being found out as secretly dumb because I couldn't keep up with discussions that leaned on those kinds of intelligence. I feel a lot better about this now, though it still haunts me sometimes.
What's changed since then?
"It is also similarly the case that EA's should not support policy groups without clear rationale, express aims and an understanding that sponsorship can come with the reasonable assumption from general public, journalists, or future or current members, that EA is endorsing particular political views."
"Other mission statements are politically motivated to a degree which is simply unacceptable for a group receiving major funds from an EA org."
I think the point has been made in a few places that more money means a lower barrier to entry and is an opportunity to reduce elitism in EA, and I just wanted to add some nuance:
Thank you for writing this post; I know these take a lot of time, and I think this was a really valuable contribution to the discourse that resonated strongly with me.
I find it helpful to get clearer about who the audience is in any given circumstance, what they most want/value, and how money might help/hurt in reaching them. When you have a lot of money, it's tempting to use it as an incentive without noticing it's not what your audience actually most values. (It also creates the danger of attracting the audience that does most value money, which we obviously don't want.)
For example, I think two critical audiences include both 'talented people who really want to do good' and 'talented people who mostly just like solving hard problems.' We're competing with different kinds of entities/institutions in each case, but I think money is rarely the operative thing for the individuals I'd most like to see involved in EA.