
Epistemic status: Personal experience and anecdotes have led me to this bundle of weakly-held views, and I’m wondering what other people think. My experience is heavily biased towards the existential risk subculture of EA, so it’s possible that my criticism applies much less (or not at all) outside of that subculture.

I think the EA community is too reliant on personal connections in a number of ways. I think that many individuals and organizations would benefit (by their own lights) from taking steps to be less reliant on personal connections, and that such steps would also increase overall impact by supporting better ideas, increasing community health, and improving the community’s reputation among outsiders.

Evidence that EA is very reliant on personal connections

Many large donors (and donation advisors) do not take general applications. This includes Open Philanthropy (“In general, we expect to identify most giving opportunities via proactive searching and networking”), Longview, REG, CERR, CLR[1], and the new Longtermism Fund. The 80,000 Hours job board does not take applications, saying, “Due to resource constraints, we are currently unable to process unsolicited requests to list vacancies on this board.” (Edit: It looks like they recently started taking applications.)

Compared to other communities, I think EA organizations and grantees are more likely to get funding from one of a small group of funders. Historical EA funding data is dominated by Open Philanthropy, GiveWell, and (starting in 2022) FTX. Effective Altruism Data suggests that Founders Pledge might also be in this small dominant group. I don’t know how to compare this to other communities or research fields, so I can’t really back up my claim with data, but I’d be interested if anyone has thoughts on this.

Compared to other communities, EA grantees and grantors share a higher number of personal and professional connections. I can’t figure out how to operationalize this and don’t have any sources, but it seems obvious to me. How many physicists live in a group house with their NSF grant evaluator? How many physicists have a part-time job making grants to their colleagues? Both of these things are common among employees of organizations in the EA community.

As a counterexample to my claim about the lack of applications, it’s important to note that some large EA-affiliated funders do take applications, for example EA Funds, SFF, CLR, ACE, and FTX (though only once, and they “do not currently have any plans to resume accepting applications and do not know if or when we will do so”). Also, GiveWell allows “organizations [to] apply for a recommendation without receiving an invitation from GiveWell.”

Finally, 80k’s job board includes lots of jobs at organizations that don’t normally consider themselves part of the EA community, and also recently(?) started taking applications. I see both of these facts as evidence against EA overreliance on personal connections.

Why is overreliance on personal connections a problem?

The more reliant we are on personal connections to achieve our altruistic goals, the more we entangle the EA community with the EA project. Having an EA community is good[citation needed], but the community is not identical to the project.[2] Communities enforce certain norms and styles (any community that does not do so loses out on many of the benefits of being a community), and so necessarily exclude people who don’t want to follow those norms or who dislike that style. Furthermore, identifying the community with the project risks making people feel like once they leave the community, they might as well stop pursuing the project as well. Put another way, “I don’t feel like I belong in this community, so I guess I’m not meant to do [job/project/skill set that EAs consider impactful].” Other people have written about this; one example is If you're unhappy, consider leaving.

Close connections discourage criticism. Criticism of EA Criticism Contest addresses this in a few ways, especially the comments on The Judging Panel. I would add that (1) it’s harder to notice problems with the views of people you know and like, and (2) once you notice the problems it’s harder to express them because of social pressure to not be a jerk. (NegativeNuno is an admirable attempt to fix (2) by explicitly presenting an entirely new persona. I think the account’s existence supports my point.)

Overreliance on personal connections contributes to the EA community’s existing lack of diversity in terms of:

The linked posts describe a huge variety of problems that are caused or exacerbated by this lack of diversity. There have also been posts discussing the strengths of diversity in general, for example: EA Diversity: Unpacking Pandora's Box, In diversity lies epistemic strength, and Why & How to Make Progress on Diversity & Inclusion in EA.

Overreliance on personal connections is not good preparation for scaling. At some point EA is going to be too big for personal connections to cover a significant portion of the movement, and leveraging personal connections will become less and less useful. As this happens, either (1) individuals relying on personal connections will become less and less effective, and/or (2) the community will fragment into subgroups small enough to be built on personal connections. I think both of these phenomena will be difficult to recognize from the individual’s point of view. Since the EA community is currently growing rapidly, we might already be experiencing these problems.

When we build our systems entirely out of personal trust, we are more vulnerable to having that trust taken advantage of. Several recent posts (I like this one) have addressed the potential for specifically the grantor-grantee trust relationship to be compromised by increasing the amount of money that is flowing. But grantor-grantee is not the only trust relationship that plays an important role in the EA community. Other important trust relationships include:

  • Employer-employee: If I hire someone with whom I have a first- or second-degree personal connection, I can more quickly trust them to share my values, especially when it comes to high-stakes decisions with slow feedback loops. If employers get used to this, and build this assumption into how their organization is run, bad things can happen when someone without those values joins the organization. We can either suffer the problems with scaling I addressed in the previous paragraph, or we can build organizations that are able to succeed without the assumption of personal trust.
  • Leader-member: I claim (with no evidence and based entirely on vibe) that, compared to leaders in other communities, people with high EA-community status are fewer degrees of separation away from the average EA community member. If I know [EA leader] personally, or know someone who knows [EA leader] personally, I am more likely to trust them to share my values when it comes to strategic decisions that affect my community (e.g. the Coordination Forum / Leaders Forum). Consequently the success and legitimacy of these leaders is overly reliant on personal connections. As the community grows, more people will lack personal connections with EA leaders. If we don’t have a system for establishing their legitimacy and authority that doesn’t rely so heavily on personal connections, the community will weaken as it grows.

Overreliance on personal connections can affect the EA community’s reputation in the outside world. People outside the movement look at EA, accurately see that it’s isolated and that they don’t know anyone in the community, and decide that EA must be flawed in some important way. I think “EA is a weird cult” is closely related to this. More on that in How EA is perceived is crucial to its future trajectory.

Solutions

For Grantmakers

Grantmakers should take more open applications for funding, and publicize those applications in more places and to more (non-EA) communities.

For Hiring Managers

Hiring managers should post jobs in more places, and be less dismissive of “non-EA” applicants. To be clear, it’s OK to value “belief in the mission” in candidates. All non-profits (and many for-profits) want this! I’m not saying that we should stop caring about whether candidates and employees understand and care about their organization’s mission. The mistake is assuming that the only people who understand and believe in my organization’s mission are members of the effective altruism community. EA ideas don’t have to come in a complete package. People can believe that one organization’s mission is really valuable and important, for different reasons, coming from totally different values, and without also believing that a bunch of other EA organizations are similarly valuable. A great example of this can be found at A subjective account of what it's like to join an EA-aligned org without previous EA knowledge.

And a personal example: When I started working at BERI, I was not part of the EA community, but I did identify with EA values. In retrospect, I’d say I was part of the EA project but not part of the EA community. Now I’m part of both.

For Community Builders[3]

As a community, we should try to build connections with other communities. Different segments of EA share values and interests with many other communities (e.g. development economics, public health, artificial intelligence, epidemiology, animal welfare) but I have a sense that EA ideas and efforts exist at the margins of each of those communities and are often not taken seriously. Community builders could actively work to bridge these social gaps. But instead of building community by wholly converting one person at a time into a card-carrying EA, build community one level up, by making connections between the EA community and other communities that partially overlap with us in terms of values and/or interests.

One more concrete idea here is to co-host events with other communities. And I don’t just mean EA Stanford co-hosting with EA Berkeley. Find communities that share some (but not all) of your values, interests, or goals, and co-host an event with them. Some good ideas at A Guide to Early Stage EA Group-Building at Liberal Arts Colleges.

For Individuals

You should have friends and professional connections outside of the EA community.[4] Most of the problems I’ve described above only appear when members of the community tend to have very similar networks. Leaning on personal connections is a natural human tendency; if we diversify our connections, we can mitigate some of the problems that it can cause.

It’s OK for you to take a break from EA or to leave the community altogether. If you think you might want (or need) to do this now or in the future, I recommend Leaning into EA Disillusionment and the abovementioned If you're unhappy, consider leaving.

Summary, confidence levels, and conclusion

In this post, I tried to show that the EA community’s overreliance on personal connections is a problem.

I started by giving some evidence that EA is very reliant on personal connections. Ultimately this is the part of the post I’m least confident about, mainly because I don’t know how to easily compare with other communities.

Next I addressed why I think overreliance on personal connections is a problem: it discourages criticism, decreases diversity, prevents scaling, harms the community’s reputation, makes us more vulnerable to being taken advantage of, and wrongly identifies the community with the project. This is the section I’m most confident about: There exists a level of reliance on personal connections that is harmful, and these are some of the harms. I don’t know exactly where that level is or if this community has exceeded it.

Finally I talked about some solutions: ways that grantmakers, hiring managers, community builders, and other individuals can decrease the reliance on personal connections by themselves and by the community as a whole. These actions seem good to me, but I’m pretty sure they’re not the best solutions, and most of them could have negative impacts (on metrics unrelated to this post) which should be considered before acting.

None of this is unique to EA. While I think EA is particularly guilty of some of these issues, in general I could aim this criticism in any direction and hit someone guilty of it. But “everyone else does it” is not in and of itself a reason to accept it. We claim to be doing something really difficult and important, so we should try to be as good as possible.

Also, I think personal connections are great and we should continue to use them in pursuit of altruistic goals. EA is more fun when you do it with friends, and there are tons of reasons to think that use of personal connections will make us more successful as a project and as a community. I just think maybe we’re too far in that direction and we could benefit from toning it down a bit.

Throughout the post, I linked to many other Forum posts. Please don’t feel like you have to read 20 other posts before engaging; you can definitely just tell me why you think I’m wrong without following the links. I included the links because many parts of this post have been written in different ways elsewhere, and I wanted to recognize and point out those contributions.

This post is based heavily on personal experience, so I’m very interested in hearing about your experience with these topics. Thanks for reading.

Edited 2022-09-02: I originally included CLR in my list of grantmakers who do not take applications. This was incorrect: CLR takes general applications to the CLR Fund and would love to get more of them.

Edited 2022-09-09: I originally said that the 80,000 Hours job board does not take applications for new job listings. This appears to have changed.

Thanks to Kyle Scott and Sofia Davis-Fogel for helpful discussion and review.

  1. Per Chi's comment, CLR takes general applications to the CLR Fund and would love to get more of them! This was a mistake on my part.

  2. And I think that claiming these are identical is close to claiming that everything EAs do is more effective than anything non-EAs do, which I think is very wrong.

  3. Keeping in mind the excellent points from Let's not have a separate "community building" camp.

  4. This is important for other reasons too.

Comments

Thanks for this post. It fits my experience and reasoning. I suspect that people don't see it as a major problem yet because we are still used to thinking of EA as a "small movement". I suspect this is one of the usual issues in the long-term growth of a movement: it either becomes an obstacle to scaling or threatens the "fidelity" to core principles.

Chi

Many large donors (and donation advisors) do not take general applications. This includes Open Philanthropy (“In general, we expect to identify most giving opportunities via proactive searching and networking”), Longview, REG, CERR, CLR, and the new Longtermism Fund.

Grant manager at CLR here - we take general applications to the CLR Fund and would love to get more of them. Note that our grantmaking is specifically s-risk focused.*

Copy pasting another comment of mine from another post over here:

If you or someone you know are seeking funding to reduce s-risk, please send me a message. If it's for a smaller amount, you can also apply directly to CLR Fund. This is true even if you want funding for a very different type of project than what we've funded in the past.

I work for CLR on s-risk community building and on our CLR Fund, which mostly does small-scale grantmaking, but I might also be able to make large-scale funding for s-risk projects ~in the tens of $ millions (per project) happen. And if you have something more ambitious than that, I'm also always keen to hear it :)


*We also fund things that aren't specifically targeted towards s-risk reduction but still seem beneficial to s-risk reduction. Some of our grants this year that we haven't published yet are such grants. That said, we are often not in the best position to evaluate applications that aren't focused on s-risk even if they would have some s-risk-reducing side effects, especially when these side effects are not clearly spelled out in the application.

Thanks Chi, this was definitely a mistake on my part and I will edit the post. I do think that your website's "Get Involved" -> "CLR Fund" might not be the clearest path for people looking for funding, but I also think I should have spent more time looking.

The fact that EA is more legible than other philanthropy makes this non-obvious, but I think that EA is less reliant on personal connections, or inside connections with powerful people, than most other major philanthropy. Academic funders are similar, but in place of personal connections they rely on scholarly qualifications (mostly from elite institutions) and on implicit or explicit training in how to apply for those grants and how to appeal to the individuals who evaluate them - a different but also problematic failure mode.

In contrast, there is much more openness to non-insiders building connections, much more openness to feedback, etc. In fact, posting useful and important things to the EA Forum is often enough to get attention and potentially funding. But I agree that people used to other types of credentialing probably see this as much more exclusionary, albeit mostly because they don't look at the community before applying for funds.

Of course, this is based on my personal experience with other philanthropies and funders, and may not generalize. I would be interested in hearing from others who have gotten grants from other foundations about whether they are more or less closed to outsiders, and/or whether the qualifications are more or less difficult to know without inside information and connections to others.

Great points, thanks David. I especially like the compare-and-contrast between personal connections and academic credentials. You're probably more experienced with academia and non-EA philanthropy than I am, so your empirical views are different. But I also think that even if EA is better than these other communities, we should still be thinking about (1) keeping it that way, and (2) maybe getting even less reliant. This is part of what I was saying with:

None of this is unique to EA. While I think EA is particularly guilty of some of these issues, in general I could aim this criticism in any direction and hit someone guilty of it. But “everyone else does it” is not in and of itself a reason to accept it. We claim to be doing something really difficult and important, so we should try to be as good as possible.

I think your observations may be counterevidence to anyone saying that EA should become more reliant on personal connections, since you think (possibly correctly) that other major philanthropy is more reliant on personal connections than EA is, and I assume we agree that EA philanthropy is better than most other major philanthropy.

I like some of the ideas:

  • Cohosting events with other movements
  • Seeding groups in more countries
  • Projects to bring in skills that EA currently lacks (e.g. EA Communications Fellowship, writer's retreat, etc.)

On the other hand:

  • I think that the author undervalues value alignment, and the fact that the natural tendency is regression to the norm unless specific action is taken to avoid it
  • I agree that as EA scales, we will be less able to rely on personal relationships, but I see no reason to impose those costs now
  • I agree that it may affect our reputation in the outside world, but I don't think it's worth increasing the risk of bad hires to attempt to satisfy our critics.

I'm worried about the tensions that EA being both a social and professional community entails, but I don't have a good solution to this, and maybe the status quo is the least bad option?

Thanks for the thoughtful feedback Chris!

I think that the author undervalues value alignment, and the fact that the natural tendency is regression to the norm unless specific action is taken to avoid it

I think there is a difference between "value alignment" and "personal connection". I agree that the former is important, and I think the latter is often used (mostly successfully) as a tool to encourage the former. I addressed one aspect of this in the Hiring Managers section.

I agree that as EA scales, we will be less able to rely on personal relationships, but I see no reason to impose those costs now

Fair, but I worry that if we're not prepared for this then the costs will be greater, more sudden, and confusing, e.g. people starting to feel that EA is no longer fun or good and not knowing why. I think it's good to be thinking about these things and make the tactical choice to do nothing, rather than leaving "overreliance on personal connections can be bad" out of our strategic arsenal completely.

I agree that it may affect our reputation in the outside world, but I don't think it's worth increasing the risk of bad hires to attempt to satisfy our critics.

I don't think my suggestions for hiring managers would increase the risk of bad hires. In fact, I think moving away from "my friend is friends with this person" and towards "this person demonstrates that they care deeply about this mission" would decrease the risk of bad hires. (Sorry if this doesn't make sense, but I don't want to go on for too long in a comment.)

moving away from "my friend is friends with this person"

I hadn't thought of your post in these explicit terms till now, but now that you write it like that I remember that indeed I've already applied to a program which explicitly asked for a reference the head organizer knows personally.

I was rejected from that program twice, though I obviously can't know if the reason was related, and I may still apply in the future.

Explicitly asking for a reference the head organizer knows personally.

That feels pretty bad to me! I can imagine some reason that this would be necessary for some programs, but in general requiring this doesn't seem healthy.

I find the request for references on the EA Funds application to be a good middle ground. There are several sentences to it, but the most relevant one is:

References by people who are directly involved in effective altruism and adjacent communities are particularly useful, especially if we are likely to be familiar with their work and thinking.

It's clearly useful to already be in the fund managers' network, but it's also clearly not required. Of course there's always a difference between the policy and the practice, but this is a pretty good public policy from my perspective.

I should probably be more precise and say the phrasing was something like "preferably someone who [organizer] knows".

But since this is presented as the better option, I don't think I see much difference between the two, as you'd expect the actual filtering process to favour exactly those people in the organiser's network.

I think there is a difference between "value alignment" and "personal connection"

Agreed. I was responding to:

Hiring managers should post jobs in more places, and be less dismissive of “non-EA” applicants

Although we might be more on the same page than I was thinking as you write:

I’m not saying that we should stop caring about whether candidates and employees understand and care about their organization’s mission. The mistake is assuming that the only people who understand and believe in my organization’s mission are members of the effective altruism community

I guess my position is that there may be some people who don't identify with EA who would be really valuable; but it's also the case that being an EA is valuable beyond just caring about the mission, in that EAs are likely to have a lot of useful frames.

Fair, but I worry that if we're not prepared for this then the costs will be greater, more sudden, and confusing

I'd be surprised if it changed that fast. Like even if a bunch of additional people joined the community, you'd still know the people that you know.

I think the extent to which "member of the EA community" comes along with a certain way of thinking (i.e. "a lot of useful frames") is exaggerated by many people I've heard talk about this sort of thing. I think ~50% of the perceived similarity is better described as similar ways of speaking and knowledge of jargon. I think there are actually not that many people who have fully internalized new ways of thinking that are (1) very rare outside of EA, and (2) shared across most EA hiring managers.

Another way to put this would be: I think EA hiring managers often weight "membership in the EA community" significantly more highly than it should be weighted. I think our disagreement is mostly about how much this factor should be weighted.

Fair point on the fast-changing thing. I have some thoughts, but they're not very clear, and I think what you said is reasonable. One very rough take: yes, you'd still know the people you know, but you might go from "I know 50% of the people in AI alignment" to "I know 10% of the people in AI alignment" in 3 months, which could be disorienting and demoralizing. So it's more of a relative thing than the absolute number of people you know.

As an outsider slowly trying to scratch my way in, and as a person who was a founder and tight buddy of other founders of a former movement I was in, I find this challenging and this post encouraging. We had this same problem in my previous movement. It's a normal thing, and it's just one of those things you have to become aware of and work to overcome - for example, by asking you, the author, to write a follow-up post on some practical solutions after you've had some time to discuss this post with people and get some group-think wisdom going.

One obvious issue is simply that a small insular group is unlikely to generate enough great new ideas on its own… outside thinking, foreign thinking, has to stream in or you get stale and die. Think still water with no flow.

Anytime you can make outsiders feel heard and understood and hopeful of getting in you are doing something of great value. Thank you.

Good point. This made me think that a way to mitigate this problem (instead of trying to "solve" it) is to acknowledge the importance of being open to new acquaintances, and of people who serve as "nodes" or "bridges" between different social networks.

Yes, and I can say from experience that trying to "fix it" can lead to other problems, for example throwing out the old and forcing in the new… the original organic relationships are very valuable and can't be artificially replicated… the answer is simply opening up more, and that may require infrastructure modifications that add but don't subtract.

Great point about mitigating as opposed to solving. It's possible that my having a "solutions" section wasn't the best framing. I definitely don't think personal connections should be vilified or gotten rid of entirely (if that were even possible), and going too far in this direction would be really bad.

Broadly, I agree. I think that social connections bringing benefit is a deeply human trait, but I think we can do better.

My suggestions:

  • Acknowledge the situation. Yes, we are a community that runs largely on interpersonal networks, and we lose information and talent from people who don’t have access to those networks. In exchange, we gain the ability to move quickly, often without grifters. What are the costs and benefits in lives saved?
  • Relentlessly build systems that scale. Personal connections are much higher fidelity than submission forms, and people you vibe with are usually better employees than random people. But at scale these heuristics break down: there are ideas you’ll miss without submission forms, and better candidates you won’t vibe with. I am attempting to build systems in forecasting question generation and prize generation that don’t require people to know someone to get something done. Superlinear is the org that gets this best, in my opinion.

Thanks, I thought this was a thoughtful post. I largely agree with the empirical analysis - that EA relies a lot on personal connections. Normatively, I'm a bit more positive about the current approach, though. For instance, I think it can make it easier to make fast, low-cost decisions. It can also help you find good projects.

Thanks Stefan! I agree with those strengths of personal connections, and I think there are many others. I mainly tried to argue that there are negative consequences as well, and that the negatives might outweigh the positives at some level of use. Did any of the problems I mentioned in the post strike you as wrong? (Either you think they don't tend to arise from reliance on personal connections, or you think they're not important problems even if they do arise?)

Something that didn't strike me as wrong, but as something worth reflecting more on, is your analysis of the tension between reliance on personal connections and high rates of movement growth. You take this to be a reason for relying on personal connections less, but one may argue it is a reason for growing more slowly.

Another point worth bearing in mind is that your (correct) observation that many EA orgs do not take general applications may itself be (limited) evidence against your thesis. For example, the Future Fund has made a special effort to test a variety of funding models, and so far they have found that their regranting program (which relies on personal connections) has delivered significantly better results than the open call for applications. As the FF team writes,

We thought that maybe there was a range of people who aren't on our radar yet (e.g., tech founder types who have read The Precipice) who would be interested in launching projects in our areas of interest if we had accessible explanations of what we were hoping for, distributed the call widely, and made the funding process easy. But we didn’t really get much of that. Instead, most of the applications we were interested in came from people who were already working in our areas of interest and/or from the effective altruism community. So this part of the experiment performed below our expectations.

Of course, this is by no means decisive evidence, but it lends some support to the hypothesis that EA may be relying significantly on personal connections not because it has neglected alternative models, but because it has tested them and found them wanting.

tension between reliance on personal connections and high rates of movement growth. You take this to be a reason for relying on personal connections less, but one may argue it is a reason for growing more slowly.

I completely agree! I think probably some combination is best, and/or it could differ between subcommunities.

Also thanks for pointing out the FTX Future Fund's experience, I'd forgotten about that. I completely agree that this is evidence against my hypothesis, specifically in the case of grantee-grantor relationships.

The 80,000 Hours job board does not take applications, saying, “Due to resource constraints, we are currently unable to process unsolicited requests to list vacancies on this board.”

I think this is no longer true. From here:

We only list roles that we think are among the best opportunities according to our listing criteria. However, we realise that there will be some great opportunities out there that we are not aware of.

If there is a role you think we should be listing on the job board, please send the link (and supporting information if needed) to jobs@80000hours.org, and we will consider it for listing.

Though this is a costly process, and I guess 99% of people who would otherwise submit jobs don’t. Look at the jobs on @effective_jobs that aren’t on the 80k job board; many are from top-tier organisations.

80k is revamping their jobs board, so we’ll see in 6 months. Currently I think the criticism largely stands.

Good catch, thanks! I can't find my original quote, so I think this was a recent change. I will edit my post accordingly.

This should become an EA priority over the next decade, because the old model, in which personal networks allow us to become efficient, doesn't work above Dunbar's number.

[This comment is no longer endorsed by its author]

I don't think this is true. Dunbar's number is a limit on the number of social relationships an individual can cognitively sustain. But the sorts of networks needed to facilitate productive work are different from those needed to sustain fulfilling social relations. If there is a norm that people are willing to productively collaborate with the unknown contact of a known contact, then surely you can sustain a productive community with approximately Dunbar's number squared people (if each member of my Dunbar-sized community has their own equivalently-sized community with no shared members).
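To make the arithmetic explicit, here is a rough sketch. It assumes the commonly cited value of 150 (which the study quoted below disputes) and zero overlap between members' networks (an idealization):

$$N_{\text{one hop}} \approx D^2 = 150^2 = 22{,}500$$

Real networks overlap heavily, so this is an upper bound; but even a heavily discounted version is far larger than Dunbar's number itself.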

Dunbar's number has received scholarly criticism.

A widespread and popular belief posits that humans possess a cognitive capacity that is limited to keeping track of and maintaining stable relationships with approximately 150 people. This influential number, ‘Dunbar's number’, originates from an extrapolation of a regression line describing the relationship between relative neocortex size and group size in primates. Here, we test if there is statistical support for this idea. Our analyses on complementary datasets using different methods yield wildly different numbers. Bayesian and generalized least-squares phylogenetic methods generate approximations of average group sizes between 69–109 and 16–42, respectively. However, enormous 95% confidence intervals (4–520 and 2–336, respectively) imply that specifying any one number is futile. A cognitive limit on human group size cannot be derived in this manner.
