
Some time ago, I was explaining to a friend what I do. I mentioned effective altruism, and his first reaction was basically, “Wait, isn’t that the FTX thing?” A few days later, my wife called me over to show me a clip from a TV show (I don’t remember the name) making fun of EA.

Those two events, coming so close together, made me think: a narrative about EA is being created whether EAs like it or not. And if that is true, then refusing to help shape that narrative (or not putting enough effort into shaping it) might not be very effective.

Not just because reputation matters in some vague PR sense, but because reputation affects how much funding and talent a movement can attract over time. Trust and brand identification are important drivers of donation intent, and also affect whether talented people want to apply to EA-related organizations.

I also think that building its narrative matters even more for EA because it is unusually exposed to narrative risk: it's a movement with big claims, unusual ideas, and a style that many people already find "strange".

So here is my basic claim:

If effective altruism wants to keep attracting funding and talent, and wants its best ideas to survive contact with mainstream attention, it should put more effort into building its own narrative instead of letting it be shaped randomly.

I’m deliberately not getting into “what to do next” in this post, because that seems like a much more complex discussion (it would require a deeper understanding of what resources exist, what has already been tried, what tradeoffs are involved, etc…). But, if people find this line of thought interesting, I’d be happy to dig into that further. Let me know in the comments.

A final note: I apologize if this comes across as dismissive of the people already working hard to make EA stronger. I know you're out there. My point is not that no one is trying. It is just that, from my perspective, too little effort seems to go into building a narrative for the general public.

Comments (12)

I disagree pretty strongly with this.

The biggest danger to EA is being cool. I mean this completely seriously. When EA becomes cool, people who don't care about EA show up to secure money and status. When EA becomes cool, that reputation needs defending, which is often corrosive to truth. EA being unpopular enough to deter status-seekers while not being so universally loathed that even mission-aligned nerds hesitate to associate with it for fear of social repercussions is exactly where we ought to be.

I think the argument of "EA could achieve more if it had a better reputation" is compelling, intuitive, and wrong. It seems like you're imagining a cool version of EA that tons of smart people want to join, but also maintains the same level of mission alignment and commitment to truth. I think this is actually impossible. 

EA's impact is a product of magnitude × direction. A better reputation increases our magnitude, but it's very easy for that direction to get much closer to zero. (And, since we take the money of committed altruists, a direction that is insufficiently positive is actually net-negative, thanks to opportunity cost.)

(You reminded me of this essay.)

I don't think 'having a better reputation' means 'being cool'.

I think 'having a better reputation' means people primarily associating EA with core ideas such as evidence, cost-effectiveness, impact, impartiality, and counterfactual thinking (rather than e.g. FTX).

I suspect there are plenty of 'mission-aligned nerds' out there who have been put off EA because they first heard about the bad stuff rather than the good stuff (though I also expect the overwhelming majority of 'mission-aligned nerds' simply haven't heard of EA at all).

I think the magnitude × direction framing is really useful, and I agree the risk is real.

At EA Netherlands, I've been thinking about this through the lens of Ben Todd's notion of "community capital" — roughly, the stock of shared values, trust, human capital, coordination capacity, norm-following, and reputation that a community accumulates over time. The worry you're describing is essentially that outreach erodes community capital. And it can — if you do it carelessly.

But my ambition is to try to monitor this over time. If we surveyed relevant aspects of the community periodically — tracking its values, the degree to which people are following community norms, the actions people are taking, the degree of interconnectedness between members, and the quality of human capital coming in — we might be able to detect whether outreach efforts are degrading the direction term, and course-correct if they are.[1]

If that's feasible, it turns a binary question ("should we grow or not?") into an empirical one ("is this particular form of growth maintaining alignment?"). Some forms of outreach might pass the test, and others might not. But you'd want to check rather than assume.

I think the implicit model in your comment is one where we have to choose — stay small and aligned, or grow and dilute. But perhaps there's a third option: invest seriously in both outreach and monitoring community capital, and course-correct over time.

  1. ^

    Out of interest, has this been considered, @David_Moss?

It has been considered for the EA Survey! I'm not sure why this was never prioritised for inclusion, after being raised. But if the meta orgs we work with say they want us to include these questions in the survey going forward, we will.

Good to know! If it doesn't get included in the EA Survey, we might consider doing it ourselves at the national level (M&E budget allowing...). 

But obviously, international data that would allow us to compare across regions would be more useful.

EA diluting its message to expand would result in more unqualified people applying for jobs on the EA job boards, which would make them worse job boards.

Although I don't think EA should go mainstream and have its message diluted, I think this statement is wrong. Unqualified and qualified applicants would have the same probability of stumbling across EA if it ever goes mainstream. I think this idea that "smart people don't consume mainstream stuff" is very wrong. 

'When EA becomes cool, people who don't care about EA show up to secure money and status. When EA becomes cool, that reputation needs defending, which is often corrosive to truth.'

I think that's very well said. 

Good point. However, my claim is not that "EA should be cool" or that we should work to make it mainstream; I pretty much agree with you on that. My point is that EA should put more effort into building its own narrative for the general public (which does not mean trying to make it look cool). Otherwise that narrative will be built by someone else, and the outcome will very likely not be beneficial for EA itself.

Posting in a personal capacity.

I'm excited about EA doing more to shout about its wins and defend itself against bad-faith detractors and would love to see more discussion of what it could concretely look like for EA to take more control of its own narrative in this way.

I think the core point here is right: whether we like it or not, there is a narrative about EA that gets constructed, and if we choose not to do our own narrative shaping, then EA's critics control more of the narrative. Choosing not to engage doesn't make the narrative 'authentic', and the status quo isn't 'neutral'. EA has had a lot of cool wins, but almost nobody outside the community knows about them, so I want to see more people pointing out the wins and pointing out how EA principles informed them.

I think @Andy Masley has been a great example of what this can look like in practice. Andy has written a lot about a topic not usually associated with EA, data center water usage, but has written and engaged publicly in a way that’s clearly informed by EA ideas, and he's unashamed about his connection to EA. I’d personally love to see more of this. There's an EA thought leadership gap right now, with not many people who write and speak publicly from an EA perspective. I'd love to see a new generation of people who write, speak, and engage publicly on a variety of topics from a perspective informed by EA principles.

I also don’t think that doing more of this is the same thing as trying to make EA cool or trying to expand it. It’s just trying to make sure that people have a clearer understanding of what EA principles are and what they have led to in the world so far, the good and the bad. I’d love an end state where people who are into EA ideas are clearer about how they see EA and their relationship to it, and for people not to feel embarrassed to say they're into EA ideas or part of the EA community. 

I'm keen to hear more concrete ideas for what doing more good proactive narrative shaping could look like. 

I hear you. The School for Moral Ambition, for example, is a bit like that: in a way, it's EA but unweird and less about philosophy, though the core principles of effectiveness remain. There is simply more of a systemic-change vibe, with the tax fairness fellowship.

I think that we are already seeing status-seeking people get into the movement, or at least that is my experience as I go from conference to conference. So I am not sure how much of a risk that is. Plenty of folks laid off due to the USAID cuts, or affected by the general lack of hiring at big GH orgs, are seeing EA as a job market. Which is good, because they are usually very talented.

I definitely think that we should invest more in creating our own narrative, because so far it has been a lot of responding to criticism rather than having our own voice. Without compromising on principles, we can still better control our image, since so far people know EA mostly because of scandals. It hurts me that people know more about SBF than LEEP, for example.
