[author note: this is a very brief overview of a paper I'm writing and I'm curious about others' thoughts on this topic and maybe some feedback with gaps in my thinking or alternative perspectives].

EA seems to cater to individuals who already have a strong disposition toward values-based living and/or a strong sense of character. Altruism is considered a virtuous pursuit, and one can therefore argue that people with a higher sense of morality and values are more inclined to engage with the movement. The question, then, is how to connect with individuals who are less values-driven. [This assumes the goal is to continue growing, which is typically the goal of any movement. If EA doesn't aim to extend its reach, then it's more a community than a movement].

It’s interesting to note that EA skews younger, which contradicts the above notion - a stronger adoption of values and ethics tends to be more commonplace among older adults. Older generations had greater externally-motivated/socially-driven incentives to at least passively engage in values-based living: religion, arguably one of the greatest drivers of values and morality systems, was at the center of most communities and socialization. Religious affiliation has been declining for years - not to say younger generations aren’t spiritual - but it does suggest less desire to connect over shared beliefs about the afterlife, creation, and value systems. If younger generations are less connected by (or less interested in) values, how can the EA movement grow beyond word of mouth? [It would also be interesting to note how EA is working to connect with individuals from older generations in general...].

I believe one of the reasons EA has so far skewed young is that it was founded in 2011 by a group of young, new grads. In the early stages of any group, the newest members often closely resemble the demographics of the founding team, since initial growth usually occurs through word of mouth and friends. This still seems to be a core method of attracting new members, but it is limiting in the long run - if you have an inclination toward stronger values, then you likely associate with people who do as well. Word of mouth doesn't work as well outside those social circles, because the person you are talking to about EA may not share the same inclination toward values and ethics.

One method is to connect to an individual's interests through cause priorities. This is arguably already in practice - most people in EA seem to identify with a specific area of personal or professional focus, be it through direct work or donating. However, this could still be limiting, given there are only three to five core priorities at the center of conversations at any given time. If these priorities genuinely don’t interest someone, is there still an opportunity for them to connect with EA?

Another strategy is to bypass the moral/ethical framing and connect people with information they want or need - essentially getting people to passively engage with the movement without active identification (e.g. someone who gets a job through the 80,000 Hours job board but doesn’t associate with EA). Is this a win for EA? Is there something to be gained in maintaining a small community of committed members while expanding the movement's reach to engage more people in its goals without requiring intentional participation? Do people need to value the philosophy behind EA in order to participate in the pursuit of the EA movement's goals?


Others have said this, but you're getting at whether the movement should prioritize growth and easy assimilation or maintain high fidelity to its values. So far most of the core favors the high-fidelity model. Personally, I agree, because EA won't be as effective - and could even be destructive - if EA as a movement is not anchored in its values. But we miss out on people who don't have that somewhat extreme, values-driven bent, which is a terrible loss for EA as a community.

Even at the level of organizing at Harvard, I feel torn between seeing our club's value as spreading some good values on campus (more watered down outreach) or incubating the next generation of high-power EAs (a few intense, targeted waves of outreach). I worry that we unintentionally select for a lot of baggage when we select the intense, highly values-driven people, and that the more the entire movement does that, the more blind we are to it.

Seminal for me has been Owen Cotton-Barratt’s paper “How valuable is movement growth?” I therefore welcome the shift toward very careful (if any) growth that has happened over the past years. Today I think of the EA community as a startup of sorts that tries to hire slowly and select staff carefully based on culture fit, character, commitment, etc.

You're bringing up a lot of questions that are core to the EA movement, and which have been debated in many different places. The links from CEA's strategy page might interest you; they go into CEA's models of how to build communities, and where "impact" comes from.

In general, there's no simple answer to how much a person's personal values matter for their potential impact. To give a simplistic example, value alignment with EA seems more important for a moral philosopher (whose work is all about their values) than for a biologist (if someone decides to work on anti-aging research because they want to win a Nobel Prize and think Aubrey de Grey has a cool beard, they may still do excellent, world-shaping work despite non-EA motives).

You may want to check your intuition that older generations are more value-driven against data; older people tend to be more religious, but younger people tend to give "better" answers on many important moral questions (look up "the expanding moral circle" for more on this idea). Meanwhile, the extent to which people make sacrifices to act on their values seems to fluctuate from generation to generation; political protests go from popular to unpopular to popular again, people worry less about pollution but more about eating meat, etc.

Thanks to modern communication systems and growing moral cosmopolitanism throughout the world, this is probably the best time in history to promote something like EA, and conditions are getting better every year.

This seems connected to a perennial question in EA: should organizations be means-focused or ends-focused? By that I mean, should an EA-aligned org focus primarily on methods or primarily on outcomes? For example, when it comes to community building, an ends-focused approach would suggest we should grow as large as possible and get as many people as possible to give effectively, even if you have to lie to them to do it. A means-focused approach to community building would look more like what we have now: a heavy focus on keeping EA true to its values, even if that means losing some of the people who could have been convinced to give effectively by methods that go against EA values like careful epistemics.

So far it seems EA orgs have decided to be primarily means-focused and to accept giving up some of the gains possible via an ends-focus, since pursuing those gains would risk diluting EA values and missions, and folks in the community have been pretty vocal when they feel orgs list too close to becoming ends-focused by compromising too much on EA values. I don't know if that will continue in the future or if everyone in EA is on board with such a choice, but it's at least what I've observed happening. Given that many EAs are consequentialists, I expect we'll always see some version of this conversation happening so long as EA exists.
