[epistemic status: articulation of a position I kind of believe and think is under-articulated, but am unsure of the strength of]
I think EA has a lot of great ideas. I wish more people in the world deeply understood them, and took ~EA principles seriously. I'm very into people studying the bodies of knowledge that EA has produced, and finding friends and mentors in the ecosystem.
But I also think that EA is still a tiny corner of the world, and that there are many important networks and bodies of knowledge beyond it. When I think about the optimal allocation of people who are bought into EA, I want quite a lot of them to go out and interact with different systems in the world and different peer groups, and to learn from them and make connections.
In principle this should be pretty accessible. Except I worry about our implicit social structures sending the message "all the cool people hang around the centrally EA spaces" in a way that doesn't really support people in actually making these exploring moves while staying engaged with and encouraged by EA.
I think that this is one of the (if not the) most important problems to fix in EA messaging / status-granting.[1] Note that I don't think we want to slow down people coming into the EA bubble -- I think it's often healthy and good for people to get up to speed on a lot of stuff, to give them better context for subsequent decisions. So the challenge is to encourage people to graduate to exploring without making exploring itself so high-status that people jump directly there without learning the cool stuff that EA has to offer first.
What could we do about it? Some options:
- Encourage a narrative something like "when your EA learning slows down, that's often the time to dive back into the wider world"
- Celebrate people who follow this trajectory
- Make sure that community support structures are helpful and functional for people who have a lot of EA knowledge but are now exploring rather than "full time EA professionals"
I'd be keen to see fleshed out versions of these, or other ideas.
Absent good fixes here, I'm inclined to celebrate a certain amount of EA disillusionment: it seems important that a fraction of super talented people go and explore different areas, and if disillusionment with EA makes that easier to access, then so much the worse for people's good opinions of EA. But this seems worse than something else that could work, because of the bad feeling it creates, and because it makes it harder for people to leave exploring mode and start working with the core of the community when that's the right move.
N.B. I'm making a directional claim here. Of course it's quite possible to imagine getting to a stage where too many people go and explore, evaporating the pool of people trying to work on the most crucial things. What would be too much exploration? My guess is that in equilibrium the ideal might be for between 10% and 20% of the people who are sufficiently skilled up to do really important work in the core to be exploring instead. And a larger group around them who can't yet find crucial work in the core (but hope to some day) should also do this. But I don't put that much stock in my numbers; I'm interested in takes from people who would go higher or lower.
[1] Another candidate: wanting people who can think for themselves, but granting social status to people who appear to come to the same conclusions as leadership.
I think that if we want people to leave EA, build skills and experience, and come back and share those with the community, then the community could do a better job of listening to what they bring back. I wanted to share my story in case it's useful:
– –
My experience is of going away, learning a bunch of new things, and coming back to say "hey, here are some new things", but mostly people seem to say "that's nice" and keep on doing the old things.
As a concrete example (one thing among many): I ended up going and talking to people who work in corporate risk management, national risk management, and counterterrorism. I found out that the non-EA expert community worries about putting too much weight on probability estimates over other ways of judging risks, so I came back and said things like: hey, are we focusing too much on forecasts and probabilistic risk management tools rather than more up-to-date best-practice risk management approaches?
And then what?
I do of course post online and talk to people. But it is hard to tell what this achieves. There are minimal feedback loops, and EA organisations don't have sufficient transparency about their plans for me to tell if my efforts amount to anything. Maybe it was all fine all along and no one was making these kinds of mistakes; or maybe I said "hey, there is a better way here" and everyone changed what they were doing; or maybe the non-EA experts are all wrong, EAs know better, and there is a good reason to think this.
I don’t know but I don’t see much change.
– –
Now of course this is super hard!!
Identifying useful input is hard. It is hard to tell apart a "hey, I am new to the community, don't understand important cruxes, and think thing x is wrong, and am not actually saying anything particularly new" from a "hey, I left the community for 10 years but have a decent grasp of the key cruxes and have a very good reason why the community gets thing x wrong, and it is super valuable to listen to me".
It can even be hard for the person saying these things to know which category they fall into. I don't know whether my experiences should suggest a radical shift in how EAs think, or whether they are already well known.
And communication is hard here. People who have left the community for a while won't be fully up to date with everything, or have deep connections, or know how to speak the EA-speak.
– –
So if we value people leaving and learning, then we should, as a community, make an effort to value them on their return. I like your ideas. I think celebrating such people and improving community support structures needs to happen. I am not sure how best to do this. Maybe a red-team org that works with people returning to the community to assess and spread their expertise. Maybe a prize for people bringing back such experience. I also think much more transparency about organisations' theories of change and strategy would help people at least get a sense of how organisations work and what, if anything, is changing.