I've been involved in EA for nearly a year now. At first, it was super exciting. I resonated so much with the core ideas of EA, and I couldn't wait to get started with doing the most good I possibly could. I had no idea there was so much opportunity.
As I got further into it, my hopes started to fade, and I started to feel like I didn't really fit in. EA is pitched to the super intelligent in our society, those who did super hard degrees at Oxford or Harvard and learned to code at age 8. As for me, I'm just average. I never stood out at school; I went to a mid-ranking university and studied sociology (which has a reputation for being an easy degree). I graduated, got an average job, and am living an average life. I don't have some high-earning side hustle, and I don't spend my spare time researching how we can make sure AI is aligned with human values.
I do, however, care a lot about doing the most good. So I really want to fit in here, because that matters a lot to me. I want to leave the world a better place. But I feel like I don't fit, because frankly, I'm not smart enough. (I'm not trying to be self-deprecating here; I feel like I'm probably pretty average among the general population - and I never really felt 'not smart enough' before getting involved in EA.)
I totally understand why EA aims at Oxford and Harvard graduates; of course we want the most intelligent people working on the world's most pressing problems.
But most people aren't Oxford or Harvard graduates. Most people aren't even university graduates. So do we have a place in EA?
I want to be a part of this community, so I'm trying to make it work. But this leads me to worry about a lot of other people like me who feel the same. They come across EA, get excited, only to find out that there's not really a place for them - and then they lose interest in the community. Even the idea of giving 10% of your salary can be hard to achieve if you're balancing the needs/wants of others in your family (who maybe aren't so EA-minded) and contending with the current rise in the cost of living.
I'm guessing here, because I have absolutely no stats to back this up and it's based mostly on my anecdotal experience - but we could potentially be losing a lot of people who want to be a part of this but struggle to be, because EA is so narrowly targeted.
Whenever I come on the EA Forum, I literally feel like my brain is going to explode with some of the stuff that is posted on here; I just don't understand it. And I'm not saying this stuff shouldn't be posted just because not everyone can comprehend it. These are really important topics, and of course we need smart people talking about them. But maybe we need to be aware that it can also be quite alienating to the average person who just wants to do good.
I don't have a solution to all this, but it's been on my mind for a while now. I re-watched this Intro to EA talk by Ajeya Cotra this morning, and it really reinvigorated my excitement about EA, so I thought I'd put this out there.
I'd be really keen to hear if anyone has any thoughts/feelings/ideas on this - I'm honestly not sure if I'm the only one who feels like this.
Thanks 😊.
Yeah, I've noticed that this is a big conversation right now.
My personal take
EA ideas are nuanced, and those ideas do (and should) move quickly as the world changes and our information about it changes too. It is hard to move quickly with a very large group of people.
However, the core bit of effective altruism, something like "help others as much as we can and change our minds when we're given a good reason to", does seem like an idea that has room for a much wider ecosystem than we have.
I'm personally hopeful we'll get better at striking a balance.
I think it might be possible to have both a small group that is highly connected and dedicated (who maybe can move quickly) and many more adjacent people and groups who feel part of our wider team.
Multiple groups co-existing means we can broadly be more inclusive, with communities that accommodate a very wide range of caring and curious people, where everyone who cares about the effective altruism project can feel they belong and can add value.
At the same time, we can maybe still get the advantages of a smaller group, because smaller groups still exist too.
More elaboration (because I overthink everything 🤣)
Organisations like GWWC do wonders for creating a version of effective altruism that is more accessible and distinct from the vibe of, say, the academic field of "global priorities research".
I think it is probably worth it on the margin to invest a little more effort into the people who are sympathetic to the core effective altruism idea but might, for whatever reason, not find a full sense of meaning and belonging within the smaller group of people who are more intense and more weird.
I also think it might be helpful to put a tonne of thought into what community builders are supposed to be optimizing for. Exactly what that thing is, I'm not sure, but I feel like it hasn't quite been nailed just yet, and lots of people are trying to move us closer to it from different sides.
Some people seem to be pushing for things like less jargon and more inclusivity. Others are pointing out that there is a trade-off here, because we do want some people to be thinking outside the Overton Window. The community also seems quite capacity-constrained, and high-fidelity communication takes a lot of time and effort.
If we're trying to talk to 20 people for one hour each, we're not spending 20 hours talking to just one incredibly curious person who has plenty of reasonable objections and therefore needs someone, or several people, to explore the various nuances with them (like people did with me, possibly mistakenly 😛, when I first became interested in effective altruism, and I'm so incredibly grateful they did). If we're spending 20 hours having in-depth conversations with one person, that means we're not having in-depth conversations with someone else. These trade-offs sadly exist whether or not we are consciously aware of them.
I think there are some things we can do that are big wins at low cost, though, like just being nice to anyone who is curious about this "effective altruism" thing. Even if we don't spend 20 hours with everyone, we can usually spend 5 minutes just saying hello and making people who care feel welcome, showing them that their turning up is valued (because imo, it should definitely be valued!).
Personally, I hope there will be more groups that are about effective altruism ideas where more people can feel like they truly belong. These wider groups would maybe be a little bit distinct from the smaller group(s) of people who are willing to be really weird and move really fast and give up everything for the effective altruism project. However, maybe everyone, despite having their own little sub-communities, still sees each other as wider allies without needing to be under one single banner.
Basically, I feel like the core thrust of effective altruism (helping others more effectively, using reason and evidence to form views) could fit a lot more people. I also feel like it's good to have more tightly knit groups with a more specific purpose (like trying to push the frontiers of doing as much good as possible in ways that may be less legible to a large audience).
I am hopeful these two types of communities can co-exist. I personally suspect that finding ways for these two groups of people to cooperate and feel like they are on the same team could be quite good for our common goal of helping others better (and I think posts like this one and its response do wonders for reminding all sorts of different people that we are, in fact, all in it together, and that we can find little pockets for everyone who cares deeply, so we can all help others more).