I took a minute to think about what sort of org has a natural distinction between "core work" and "non-core work".
A non-EA example would be a uni research lab. There is usually a clear distinction between the researchers and the admin staff, where the role of admin seems similar to EA ops.
This reminds me of attitudes to quantum physics. Most current physics professors I've met have a sort of learned-helplessness relationship to quantum interpretations, subscribing to something like "shut up and calculate" (i.e. don't even try to understand). There is an attitude that quantum is too strange and therefore impossible to understand. Whereas the newer generation of post-docs and grad students don't shy away from quantum interpretations and discussions of ontology. However, this falls a bit outside your model, since quantum mechanics is ~100 years old.
EA clearly doesn't know how to handle power dynamics, and until we figure this out, we should avoid (as much as possible) creating concentrations of power. I say this in full knowledge that avoiding concentration of power is not without cost.
Some examples of broken power dynamics:
What to do:
I'm not accusing specific people of specific things. My current best model is that everything we see is what naturally happens when power is centralised. This is not about specific people; this is systemic. For example, it's not the fault of the central orgs that too many people defer to them too much; that's on the rest of us.
I'm also not saying that no specific person is blameworthy. I'm just not getting into that discussion at all.
Partly what made me think of this was talking to you about ALLFED in Gothenburg in 2018.
I remember that a few years ago there seemed to be a small but growing interest in systems change in EA. I found the Facebook group, but it's mostly dead now. Scrolling back, it sadly seems it was taken over by memetic warfare rather than discussion.
Effective Altruism: System Change | Facebook
Maybe the real reason EA has not been able to have an ongoing systems change discussion is because this is always how it ends?
Related:
Politics is the Mind-Killer - LessWrong
LW is sort of a sister community to EA, with lots of overlap in membership and influence going both ways. I believe that the above post is part of the founder effect that has kept EA away from politics, but I also think the arguments are not wrong.
Ooo look!
Someone already said all the things I wanted to say, except even better. This is great. I feel instantly less annoyed. Thanks :)
I found your systems change post
A Newcomer's Critique of EA - Underprioritizing Systems Change? - EA Forum (effectivealtruism.org)
I have to admit I only read the title and a few sentences here and there. But you are right that EAs are not much into systems change. Part of this is founder effect. But I also believe part of it is a misuse of the neglectedness framework. Systems change is basically politics, which is not a neglected area. But my prior is that there are lots of neglected interventions within it.
For example, this is super cool:
Audrey Tang on what we can learn from Taiwan’s experiments with how to do democracy - 80,000 Hours (80000hours.org)
Welcome to EA Sam!
I actually don't know; I've never done that type of research either. I mostly think about AI risk.
But I did scroll through the list of EA Forum tags for you and found these:
Research - EA Forum (effectivealtruism.org)
Global priorities research - EA Forum (effectivealtruism.org)
Independent research - EA Forum (effectivealtruism.org)
Research methods - EA Forum (effectivealtruism.org)
Research training programs - EA Forum (effectivealtruism.org)
Maybe there's something helpful in there?
[Epistemic status: Mostly a rant, but I'm also open to the possibility that I'm missing something important about how other people communicate.]
When people write "we should..." about EA, what does this mean exactly?
Whenever I read this phrase I get the sense that this person is confused about how EA works. But maybe I'm the one who is missing something.
There are groups that are all about coordinated action, and for them it does make sense to make suggestions of the form "we should...". But EA is not like that. Coordinated action is not our thing, and I don't think it should be. We are not that type of movement.
I love that EA is to a large extent a do-ocracy. The way to get something done in a do-ocracy is not to suggest that "we should" but to say, "I'm starting this project. Anyone want to join?"
The main exception, where we're not a do-ocracy, is the funding, and everything downstream from that, which is admittedly a large part of EA. If you want funding for your project, you do need to convince others to donate to you. But I don't think the path to getting funding is to write "we should..." on the EA Forum? Maybe I'm wrong, in which case please tell me!
To avoid potential misunderstanding: I'm not making any normative claims about funding in this shortform!
I'm more understanding of people who write "someone should...". I've done that. It usually doesn't work, but at least I can see where that would come from.
Can you give some examples of what these low-quality goods are?
(I notice this is an old post, but I read it for the first time today.)