Another article just dropped criticising Effective Altruism, this time in the Wall Street Journal. I'm not linking to it here because it's garbage, but if you really want to look it up, the headline is 'Effective Altruism is neither'. I'd encourage you not to, so as not to help its readership figures. [EDIT: a mirror of the article is available in the comments. But it'll put your blood pressure up.]

Other people have written very good forum posts about why we should care about perception of EA, collecting other examples of damaging commentary and suggesting some things we could do to help improve the community's image. However, I haven't seen anyone suggesting that we should create an EA PR agency, or hire a team at a PR firm to influence perception of EA. I think this would be a very good idea.

It seems at the moment like EA is leaving a vacuum, which is being filled by criticism. This is happening in multiple languages. Much of it could easily be countered but we're not in the game.

There are all sorts of reasons not to worry too much about this particular opinion piece. Its criticisms are transparently bad, I suspect even to the audience it's written for - suggesting that pandemic preparedness is 'shutting the door after the horse has bolted' is self-evidently stupid. I doubt the readers of the WSJ opinion page are a high-priority, high-potential audience for EA. Even if it were devastating criticism aimed at a key audience, it might have limited reach, and we'd only amplify it by responding.

However, the point is that we should have some experts deciding this, rather than the current situation where no one seems to be monitoring this or trying to respond on our behalf.

It seems to me that some dedicated PR professionals could fairly quickly move to a) place more positive pieces about EA in the mainstream media; b) give more exposure to high fidelity, convincing messengers from the community (e.g. Will MacAskill); c) become the go-to place for comment on damaging pieces (which currently don't ever seem to involve a response from anyone in the community); and even d) manage to head off some of the most illogical, most bad-faith criticisms before they're published.

I've been advised by people in PR that the most cost-effective way to do this would be to hire a team of 2-3 full-time people from the PR sector and pay them at market rates (so I guess ~$500k/year). It's possible that it would be better to do this by hiring a PR agency with a pre-existing team (which has fewer start-up costs), but people who work in PR say that, over time, you just end up paying exorbitant fees if you take this approach. I'd be happy with either, but instinctively lean towards the first.

In some ways, I think EA has already missed several golden PR opportunities, not least the release of several high profile books (where there has been some decent PR but I feel there probably could have been more); and the recent pandemic, which validated much of what the community has been saying for a long time. It would be good to avoid missing future opportunities; and also satisfying to see some counter-programming to take on these sporadic poor-quality/bad-faith critiques.

Call to action: if you agree, please comment or upvote; but, more importantly, send this on to people who might be able to fund this or otherwise make it happen. If you want to discuss the idea or think you can help, please DM me.


Hey, yeah, for the last few months CEA and Forethought and a few other organizations have been working to try to help accurately explain EA and related ideas in the media. We've been working with experienced communications professionals. CEA recently also hired a Head of Communications to lead these efforts, and they're starting in September. I think that it was a mistake on CEA's part not to do more of this sooner.

I think that there might be a post sharing more about these efforts in the future (but I'm not 100% sure this will happen).

Thanks Max - really good to hear. Will CEA's Head of Comms be focussed more on CEA comms or EA movement comms? I see some other comments about not over-centralising this, but I also worry about whether one person (or someone with more than one brief) has the capacity to monitor and be proactive about the whole space.

Definitely a really positive development though.

Sorry for not being clear. They'll be focused on communications for effective altruism (obviously working with a lot of our partners), rather than for CEA itself. We might expand the team further once the Head of Comms is in place, but this is the start.

That's really excellent to hear.

My understanding is that this has indeed been an unfortunate vacuum but as of a few months ago plans are now underway to fix this. So I can say that at least some "people who might be able to fund this or otherwise make it happen" are working on it, though I'm not part of these plans, I don't have much detail, and I won't claim that the plans will actually work (or that they won't work - I don't know).

I do think if anyone else decides to work on this it would be great if they would coordinate. I think it would be bad for us to have multiple non-coordinating media strategies targeted at "effective altruism" specifically.

I'm working on an EA consultancy startup (still in the planning phases), but with my experience running a large-scale operation, I could fold this kind of talent into the consultancy portion and then farm it out. Part of my job is media training, and perhaps that is something EA lacks? I need to research this more, but I tend to get the feeling that EAs disregard politics when it's not convenient, or don't engage as much as they could. Messaging is politics at its basest level, and I'll probably write a forum post on this later - it's about building rapport and stating clearly what you are doing. Comment on this with your thoughts as to what you would want from a PR team or media training so I can make some red-teaming material about it.

Have you spoken to Jona Glade about it? He’s also working on setting up a consultancy. I’m also happy to chat about this.

I have not. If you could connect the two of us, I would appreciate it. I'll message you this week when I get a chance to talk shop and I hope it will be a productive discussion.

Yeah I’d be interested in hearing more about that

I wrote a bit about this here: Public Relations: Message & Rapport 

I'll probably have a bit more about this as I think about what specific messaging "how to" would be useful for EAs.

In this particular case, the fact that it’s a poorly thought out criticism and the points are pretty easy to refute …

To me, that speaks for responding and refuting it. It's like a reverse straw man: a softball pitch setting us up for a home-run response.

Maybe classic WSJ editorial readers are low tractability audience for EA. But I think it still gets some exposure outside of conservative circles. And responding could draw some of them in.

I agree.

Mirror of ‘Effective Altruism’ Is Neither, the article in question. As it is a non-direct mirror, it should not affect readership numbers.

Thanks. I was inspired yesterday to do a point by point addressing of the piece. Feels a little "when you wrestle with a pig, you get muddy and the pig likes it", but spoiler alert I think there's nonzero worthy critique hiding in the bad writing.

Workers will rationalize high-paying jobs by giving most of their income away. Actually, when you work, you already give to society, but that is too complex for some to understand.

I think EAs live in the space between the extreme "capitalism is perfectly inefficient, such that a Wall Street compensation package is irrelevant to the (negligible) social value that a Wall Street worker produces" and the equally extreme "capitalism is perfectly efficient, such that a Wall Street compensation package is in direct proportion to the (evidently high) social value that a Wall Street worker produces". Also, insofar as capitalism is designed and not emergent, is it really optimized for social value? It seems optimized for metrics which are proxies for social value, and very much subject to Goodhart, but I'll stop before I start riots in every last history and economics department. Moreover, how about we want more number go up? If number go up is good, and working some arbitrary gig in fact makes number go up, donating some of the proceeds will make number go up more, so E2G people are correct to do both!

Animal rights and veganism are big in the movement as well.

Sorry, this reads to me like applause lights for the "I hate those smug virtue-signaling vegans because I love bacon" crowd. OP's thesis about EA doesn't really relate to our unusually high vegan population; they might as well have pointed out our unusually high queer or Jewish or computer-programmer population.

Yes, they direct money toward malaria nets and treatments for parasitic worms, but they also supply supplements for vitamin A deficiency, though genetically modified “golden” rice already provides vitamin A more effectively. Hmmm, seems like a move backward.

Sorry one sec I'm laughing a little at this "what have the romans ever done for us?" moment. "yeah, besides the malaria nets and deworming, which I admit are a plus, what have the EAs ever done for the poor?" it's like monty python! Anyway, my friend, if you think golden rice is a neglected approach to vitamin A deficiency, are you rolling up your sleeves and advancing the argument? Do you even bother to cite evidence that it's more effective? "Hmmm, seems like a move backward" is a completely unjustified and frivolous sentence. 

That’s a bit like closing the barn door after the horse has bolted.

EAs do not subscribe to the interpretation of the theory of random variables that you imply! We do not believe that random variables conserve a supply of events out in the universe of potentiality, such that an event of a particular class drains the supply of events of that class from the future. We instead believe that events of a class occurring does not imply that there's less of that class of event available to occur in the future. In fact, if anything we believe the opposite, if anything we believe that observing an event of a class should update us to think they're more likely than we did before we observed it! Moreover, EAs are widely on record advocating for pandemic preparedness well before covid. 

Partly as a result of his and his brother’s efforts, $30 billion for pandemic preparation was written into the Biden administration’s thankfully stalled Build Back Better porkfest.

From a writing style perspective, this is blatant applause lights for the tribe of those who think build back better is bad. 

Catch that? Someone else pays. Effective, but not exactly selfless. It’s the classic progressive playbook: Raise taxes to fund their pet projects but not yours or mine. I don’t care if altruists spend their own money trying to prevent future risks from robot invasions or green nanotech goo, but they should stop asking American taxpayers to waste money on their quirky concerns.

Not wrong. Policy efforts inevitably lead to this line (from this crowd at least), unless they're, like, tax-cutting. Policy EAs are advancing a public goods argument. It opens us up to every lowering-my-taxes-is-ITN guy that every single public goods argument in the world is opened up to. I don't need to point out that OP surely has pet projects that they think ought to be funded, by taxes even, and I can omit conjectures about what they are and about how I personally feel about them. But this is a legitimate bit of information about EA policy efforts. (Obviously subject to framing devices: tax increments are sufficiently complex that a hostile reader would call something "increase by 0.75%" while another reader would say "pushing numbers around the page such that the 0.75% came from somewhere else so it's not a real increment" and neither would be strictly lying). 

And “effective” is in the eye of the beholder. Effective altruism proponent Steven Pinker said last year, “I don’t particularly think that combating artificial intelligence risk is an effective form of altruism.”

I'll omit what I actually think about Pinker, but in no world is this settled. Pinker is one guy whom lots of people disagree with!

There are other critics. Development economist Lant Pritchett finds it “puzzling that people’s [sic] whose private fortunes are generated by non-linearity”—Facebook, Google and FTX can write code that scales to billions of users—“waste their time debating the best (cost-effective) linear way to give away their private fortunes.” He notes that “national development” and “high economic productivity” drive human well-being.

Seems valid to me. Nonlinear returns on philanthropy would be awesome, wouldn't they? It's sort of like "if a non-engineer says 'wouldnt a heat-preserving engine be great?' we don't laud them as a visionary inventor" in this case, because I don't expect OP to roll up their sleeves and start iterating on what that nonlinearly returning mechanism would look like! But that doesn't mean we shouldn't take a look ourselves. 

There are only four things you can do with your money: spend it, pay taxes, give it away or invest it. Only the last drives productivity and helps society in the long term.

This should clearly be in our overton window about how to do the most good. It almost alludes to the excellent Hauke Hillebrandt essay doesn't it? 

Eric Hoffer wrote in 1967 of the U.S.: “What starts out here as a mass movement ends up as a racket, a cult, or a corporation.” That’s true even of allegedly altruistic ones.

This seems underjustified and not of a lot of substance. I think what OP has portrayed may qualify as a racket to people of a particular persuasion regarding government spending, or as a cult to the "I intuitively dislike virtue signaling and smugness so I look for logical holes in anyone who tries to do good" crowd, but OP could have been more precise and explicit about which of those they think is important to end on. But alas, when you're in a given memeplex that you know you share with your audience, you only have to handwave! lol 


As Scott Alexander recently addressed, EAs are like a borg: we assimilate critics of any quality bar whatsoever. As much as we respect Zvi "guys, I keep telling you I'm not an EA" Mowshowitz' wishes to not carry a card with a lightbulb heart stamped on it, it's pretty hard not to think of him as an honorary member. My point is we really should consider borg-ing up the "taxation is theft" sort of arguments about public goods and the "investment beats aid" sort of arguments about raising global welfare. 

Ironic that the title of his column is "Inside View".

[content warning: sarcasm; less-than-charitable commentary on critics of EA]

It's not that ironic if the title is meant to imply the inside view is the only one he cares about.

I'll oblige others who don't want the WSJ op-ed to get more attention by only responding to the article in this comment thread. I don't expect specifically trait-feign or any other person to respond to this comment, though it'd be appreciated if anyone can provide relevant information.

FTX bought the naming rights to the Miami Heat’s arena and lots of umpire and referee uniforms. Since May, he has been bailing out failing crypto firms

This is the first I've read of this. It doesn't sound good or effective at face value, and it's being used to make EA look bad. That doesn't mean the attempt has succeeded. Others have commented on how many of the bogus assumptions in this op-ed are easy to debunk. It's also written in a sensational way that makes this second claim not sound too serious.

Surely SBF has some argument for why he has been, if the characterization is accurate, bailing out failing crypto firms. They could be poorly reasoned arguments for why it's a good idea. Bailing out these failing crypto firms may end up being a big, foreseeable mistake. But whether SBF's giving really is effective matters more than how one op-ed spins it as ineffective. Knowing the reasons and evidence is what matters.

Assuming for the sake of argument that so many of SBF's and FTX's giving or investments turn out to be very bad bets, it's not hard for the EA community to make clear that the EA community at large neither advised nor endorsed such choices. Nobody would say buying the naming rights to the Miami Heat's arena was one of EA's top recommendations for what FTX should do with that money. 

The EA community has already begun making distinctions like that clear. A few of the most upvoted articles of 2022 so far have been ones critical of FTX's giving approach/methodology, or the prospect of EA community getting particularly involved with SBF's political plays. Even given assumptions of such a bad scenario, which could easily be false anyway, a future scenario like this with EA having a PR team or whatever seems like it wouldn't be that hard to deal with. 

And Mr. Bankman-Fried’s various entities, along with Cari Tuna and others, have put up about $19 million for a future California ballot measure, the California Pandemic Early Detection and Prevention Act, which would add a 0.75% tax on incomes over $5 million to raise up to $15 billion over 10 years. Catch that? Someone else pays. Effective, but not exactly selfless.

It’s the classic progressive playbook: Raise taxes to fund their pet projects but not yours or mine. I don’t care if altruists spend their own money trying to prevent future risks from robot invasions or green nanotech goo, but they should stop asking American taxpayers to waste money on their quirky concerns.

The author isn't a grifter only for doing a job but he is kind of a hack in that it's evident he is pandering to a particular "own the libs" readership he has in mind.  

  1. Politically progressive or liberal multi-millionaires or billionaires are far from the only ones who advocate raising taxes or even back campaigns that will raise taxes.
  2. There are a lot of billionaires who've advocated for their own taxes being raised. Warren Buffett has. Elon Musk has said he doesn't want to pay higher taxes because he doesn't trust the government to spend the money effectively, but that he'd be more supportive if the government were to spend the money more effectively, e.g., on high-impact existential risk reduction. Even some conservative billionaires are in favour of the government expanding in at least some ways, even if they don't want to have to pay higher taxes for it. 

    The author puts almost no effort into an argument for why not even billionaires have a right to support a campaign to raise taxes on other billionaires. 
     
  3. Those he is pandering to are readers who are themselves probably not super-wealthy either. He refers to "American taxpayers" as if it's taxpayers in general, as opposed to the very small minority of them that earn more than $5 million per year. 
     
  4. The author conflates how some x-risk scenarios superficially sound like silly science fiction with pandemic preparedness being silly. Before the pandemic, most people might have let that conflation slide. After the pandemic, even most conservatives in the USA would likely see it as ridiculous.

 The last couple paragraphs are a couple random quotes out of context critical of some aspects of EA. (Other claims made in the article have been commented on by others and I've got nothing else to add.)
 

There seems to me to be a fallacy here that assumes every action SBF takes needs to be justifiable on its first order EA merits.

The various stakes FTX have taken in crypto companies during this downturn are obviously not done in lieu of donations - they are business decisions, presumably done with the intention of making more money, as part of the process of making FTX a success. Whether they are good decisions in this light is hard for me to say, but I'd be inclined to defer to FTX here.

I was thinking through such a possibility descriptively, and how the EA community might respond, without trying to prescribe what the EA community should do in a real-world scenario. I didn't indicate that well, though, so please pardon me for the error. 

To clarify, given the assumptions that criticisms of SBF's or FTX's investments or donations might be used to attack EA as a movement by association, and the EA community also had some responsibility to distance itself from those efforts, it wouldn't be that hard to do so. I personally disagree with the second assumption. 

I'm of the opinion the EA community has no such responsibility, but it seems at least some others believe it does.

SBF seems to have made some mistakes with his recent forays into politics, but they don't strike me as having been as bad as at least a significant minority of the EA community believes. My opinion is that the need some felt for the EA community to distance itself from SBF's political activities was excessive. 

The various stakes FTX have taken in crypto companies during this downturn are obviously not done in lieu of donations - they are business decisions, presumably done with the intention of making more money, as part of the process of making FTX a success. Whether they are good decisions in this light is hard for me to say, but I'd be inclined to defer to FTX here.

I agree with all of this. There are plenty of companies that have taken long(er)-term bets like the one FTX is making that have turned out to be among the best business decisions of the 21st century. Facebook, Amazon and companies Elon Musk has bought were not profitable for almost a decade. They were marred by criticisms and predictions of how they were always on the brink of imminent collapse. That was all bogus.

It's worth keeping survivorship bias in mind and the fact that some bets made like this wound up as catastrophic business decisions. Yet it's not justified to assume by default FTX's investments in this way will end up as bad rather than good decisions. That's especially true in the absence of more information. The author hasn't provided any such information and is not likely to have access to such information either. 

It seems like more pandering. I'm guessing the author is the kind who would've maligned Musk when he was a Democrat but now because Musk is a Republican defend decisions he might have criticized before.

Here are some comments on the article that I sent to my family.

In 1972 philosopher Peter Singer suggested using metrics rather than emotion to direct charitable giving.

Not sure what he's talking about. I think the main point of Famine, Affluence, and Morality is that if you can help someone without a significant cost to yourself, you should.

Effective altruism also seems to be related to the “work to give” movement. Workers will rationalize high-paying jobs by giving most of their income away. Actually, when you work, you already give to society, but that is too complex for some to understand.

Earning to give is only a small part of EA, and I don't think it's typically a post hoc rationalization. And EAs understand very well that working directly on problems can give to society - see the first WSJ article I sent.

An organization known as GiveWell will tell you what charities are effective. I did a little digging, and I’m not so sure they’re effective at all. Yes, they direct money toward malaria nets and treatments for parasitic worms, but they also supply supplements for vitamin A deficiency, though genetically modified “golden” rice already provides vitamin A more effectively. Hmmm, seems like a move backward.

It's plausible that the best way to reduce vitamin A deficiency is to invest in multiple strategies at once. But if he gave a thorough argument that donating to "golden" rice infrastructure fights vitamin A deficiency more effectively per dollar than vitamin A supplementation, then I wouldn't be surprised to see GiveWell change its recommendations.

William MacAskill, a major effective-altruism booster, told the Washington Post that more should be spent on “preparing for low-probability, high-cost events such as pandemics.” That’s a bit like closing the barn door after the horse has bolted.

The author's comment seems quite silly to me.

And Mr. Bankman-Fried’s various entities, along with Cari Tuna and others, have put up about $19 million for a future California ballot measure, the California Pandemic Early Detection and Prevention Act, which would add a 0.75% tax on incomes over $5 million to raise up to $15 billion over 10 years. Catch that? Someone else pays. Effective, but not exactly selfless.

I don't see anything wrong with SBF promoting a tax on extremely wealthy people to prevent pandemics (unless the resulting pandemic prevention efforts are less valuable than what the wealthy people would do with their money otherwise). In general, I'm sure some taxes are totally worth promoting.

I don’t care if altruists spend their own money trying to prevent future risks from robot invasions or green nanotech goo, but they should stop asking American taxpayers to waste money on their quirky concerns.

Pandemic prevention is not a "quirky" concern!

And “effective” is in the eye of the beholder. Effective altruism proponent Steven Pinker said last year, “I don’t particularly think that combating artificial intelligence risk is an effective form of altruism.”

Yes, EAs don't agree on everything, nor do I think they should. There's an emphasis within EA on updating your beliefs in response to new evidence, such as reasonable arguments from other people.

Development economist Lant Pritchett finds it “puzzling that people’s [sic] whose private fortunes are generated by non-linearity”—Facebook, Google and FTX can write code that scales to billions of users—“waste their time debating the best (cost-effective) linear way to give away their private fortunes.”

So the argument is that when deciding where to donate your money, you should use the same tactics that earned you that money in the first place? It's unclear how "cost-effectiveness" is the same as "linearity." Maybe he's advocating for donating to interventions that are like unicorn startups - interventions that could be hugely beneficial if they succeed, but probably won't do much. If so, this is kind of exactly what Open Philanthropy is doing ("hits-based giving").

He notes that “national development” and “high economic productivity” drive human well-being. So true. History has proved that capitalism is the most effective and altruistic system.

It's fully possible to believe in EA principles and support capitalism. But high economic productivity can come with damaging externalities, such as increased risk of global catastrophes from new technologies.

There are only four things you can do with your money: spend it, pay taxes, give it away or invest it. Only the last drives productivity and helps society in the long term.

That seems totally incorrect. GiveWell estimates that donations to its recommended charities have averted over 100,000 deaths.

Eric Hoffer wrote in 1967 of the U.S.: “What starts out here as a mass movement ends up as a racket, a cult, or a corporation.” That’s true even of allegedly altruistic ones.

This is one of the few points in the article that I like. EA (which EA headquarters likes to describe as "a project") resembles a cult in some ways: people worry about future catastrophes, care about "doing good," think about weird ideas, and dream about growing the movement.

Highly interesting. As a PR professional, I am happy to share my initial thoughts:

I think there are three main action fields for PR at EA:

  1. Monitor and build knowledge, e.g., Who is writing in which way about EA? What are the arguments? How to strengthen or counter them? Who are supporters? Who are potential supporters? How can we reach them? etc.
  2. Actively manage reputation of EA, e.g., PR campaigns, crisis communications, media appointments, define overall strategy, etc
  3. Support and give guidance to EA ventures and initiatives, e.g., better fundraising through more public visibility, higher acceptance of measures/work of the initiative, improving employer branding through PR etc

Regarding the question ‘agency vs internal solution’:

  • If you opt for external support, you don't build PR knowledge inside the organisation, and you don't make and develop media contacts. On the contrary, you make yourself dependent on the agency.
  • A significant and important part of PR is to manage expectations and translate/explain the needs and perspectives of journalists and other external stakeholders to internal stakeholders and vice versa. I expect this to be much more complicated in a decentralised movement. This is something that can’t properly be done from the outside. So you need internal PR professionals for that.
  • The cost argument against external agencies is valid. I think it is not the most important one though.
  • On the other side, external teams can bring in highly valuable expertise and established contacts quickly. And they are more flexible in terms of being able to increase or decrease PR activities without having to manage internal resources.

The bottom line is: if PR is something EA wants to tackle, external support can make sense for any of the three action fields I listed above. I strongly recommend, though, having at least a core team to manage PR (and manage external agencies if deployed) and to build knowledge and expertise inside the movement. 

Hey - this is great, and very reassuring. It's great that all this exists!

I would just point out one distinction, which is between 'marketing' and 'PR'. I'm not well-versed in this, but this is how I understand it: marketing = trying to get people to do your thing (access your services, donate, come to your conference, join your movement); PR = managing the public's perception of you (getting positive press, combating bad press, crisis management). I think the two often overlap (e.g. in general awareness raising) but aren't the same.

If this is right, a lot of the above is marketing, where I completely agree that there have been great strides recently (e.g. more Comms directors at EA orgs, the EA Market Testing initiative, the new EA digital marketing agency etc. etc.); but there seems to me to be a lot less PR (e.g. managing mainstream press coverage). So while One for the World is an enthusiastic part of message testing to try to boost donations, we're not aware of any efforts to place more positive press coverage of effective donations in general.

All that said, these are all great initiatives and it's exciting to see them come to fruition.

If it is the case that dollar for dollar, donation to effective Global Health and Development charities can be shown to contribute to economic growth more efficiently than typical investment (which I believe is the case) the studies showing this need to be readily available and trumpeted by EAs. I would think this is actually the case because of how cheaply these interventions tend to activate and cultivate human capital, but it's not an area that I've studied. If I recall correctly, I was surprised that some 80k guests expressed skepticism that such charities were an effective means to promote economic growth.

Of course, there are factors other than contribution to economic growth, such as curtailment of suffering, that are important too. But if there is a strong rejoinder to the investment > growth > welfare argument, it should be readily deployable not just by an EA PR agency, but by the rank and file.

My primary concern isn't that articles paint EA as good - it's rather that they paint EA as what it actually is.

As for the particular article - I'm not exactly sure it's untruthful in its depiction of EA - seems to me like the bad parts are the reactions to EA ideas. I also think it's valuable as a conservative criticism in a sea of liberal ones - it's relevant for how to expand the movement to a less represented audience.

I was with you until you said it's valuable. Conservative or not, the criticism seems to be written by someone who either hasn't given EA more than 20 minutes of careful thought (despite writing an article about it) or isn't capable of thinking in a sane manner. I don't really know how to approach such people and I don't think it teaches me much about conservatives in general (I hope not all of them think like this).

Seriously, what do you make of statements like this: 

William MacAskill, a major effective-altruism booster, told the Washington Post that more should be spent on “preparing for low-probability, high-cost events such as pandemics.” That’s a bit like closing the barn door after the horse has bolted.

When I said valuable, I meant "it tells us what conservatives may respond and how to address them, so it has value for us."

Not "I agree with it or think it's reasonable."

Edit: to illustrate, the exact quote you gave is one I originally considered quoting myself, followed by a facepalm emoji. But now I know people might think what it says, and I didn't before.

I'm not sure I follow why conservative criticism would lead to expanding the movement to a less represented audience

It informs you on how to approach conservatives.

Thanks! Makes sense. 

Most of the criticisms of EA may be more "leftist" than liberal. If you don't know, "leftist" is a catch-all term for 'left-of-liberal' ideologies, i.e., ones thoroughly to the left of the mainstream of the Democratic Party, or even social democrats. 

It's no fault of EA to not understand the distinction well because leftists themselves are often barely able to distinguish if by "leftism" they mean some kind of socialism or something else, or where the dividing line is between liberalism and far-left ideologies. 

Anyway, some of the more liberal movements, like 'The Neoliberal Project,' are among the few that are voluntary proponents of EA. 

It's no fault of EA to not understand the distinction well

I don't know about EA, but I'm from another country with a different political map, so I'm trying to approximate US politics and don't really distinguish between "liberals", "progressives", "leftists", etc.

In my country (Israel) we mostly think in terms of "left" and "right", and economic liberalism is fairly new.

Yeah, I'm from Canada, which is both similar and close enough to the USA for the country's politics to be more understandable to us. There are others in EA from Europe who occasionally have some difficulty understanding the complexities of American politics too. Anyway, while it's worth checking with some Americans in EA who might know better, my impression is that most of the criticisms of EA from the political left have been from those further left of mainstream Democrats in the United States (e.g., Clinton, Obama, Biden, etc.).

I wrote up some relevant thoughts recently in a comment (though per the reply thread I think I misunderstood the article I was replying to). I agree with much of the sentiment here, though I’m not as thrilled with the PR approach specifically, I associate PR departments more with organizations than movements, and I think EA frequently runs the risk of being too centralized as is.

Devin's comment was on a post I wrote last week. It was better than my post. I meant to post a comment sharing and recommending it to others to read until I noticed Devin had already shared the comment. I agree with almost every point Devin made. I recommend others participating in this current conversation read his comment too.

Another WSJ letter on the topic ('Where Effective Altruism and Capitalism Converge') was published around 4 August (I can't access it; one hopes it is more positive), and this week EA is featured in Time AND The New Yorker. The new hire for CEA Comms, covering general EA messaging, appears to be well timed, as the pot seems to be bubbling.

I'm an ignoramus when it comes to PR but this seems like such an obviously good idea that it makes me wonder if there are good reasons why it hasn't been implemented.

Like maybe a PR team would make EA more impervious to actually good and useful criticism.

But maybe that isn't necessary, and if there are no good reasons against a professional EA PR team, then you have my full support

The EA movement usually tries to take good criticisms seriously. EA seeks to be an intellectual and scientific movement as well. One of the reasons why EA invests so much in self-criticism is that EA can't get enough good criticism from outside sources, and criticism remains an important part of peer review. The worst problem isn't how some criticisms judge EA in general, but criticisms based on misconceptions or misunderstandings that thus spread false information about EA.

(to save some folks some time clicking or reading) the article makes a case that EA goals are subject to the seductively evil drive to collect taxes for public goods funding, and I personally feel bad I didn't predict that argument would come from this crowd once I saw that Sam, Dustin, and Cari were making specific political plays

(excuse my US-centrism) Usually when I think about the EA Republicans question, I think about nationalism, religiosity, some core "liberty is welfare and has good externalities" principles around homosexuality and freedom of movement, but this article updated me to also think about taxes (not that I think Republicans are actually against taxation in any sense, but just that there's nonzero information in what they choose to write on the tin).

I've seen left-wing criticisms of EA being a movement that wants to decrease taxes, especially on/in the crypto industry, based on how the critics perceived Sam's political plays. (I'm not aware of any such recent criticisms commenting on Dustin's and Cari's donations of any kind.) 

You wrote,

"....Even if it was devastating criticism aimed a key audience, it might have bad reach and we'd only amplify it by responding."

If it is devastating criticism reaching a key audience, wouldn't it be helpful to identify its criticisms directly and create discussion around them? Your post makes me want to subscribe to the journal just to find out for myself. 

Yes, I agree. I was just trying to explore briefly why people might think this was a bad use of time/money, and thought 'don't give this stuff oxygen' might be one of those arguments. But it's not one I agree with.

OK, so I read the argument, mm, it's basically associating EA with vegans, crypto, animal rights activists, and pro-tax policies. The OP believes investment serves to lift the poor out of poverty better than charity. There's not a lot to address, really, since a lot of the premises are true. 

It would be nice to know:

  • how investment compares to charity in alleviating poverty
  • whether EA advocated for pandemic preparedness before 2020
  • why vitamin A supplements are supplied in places where golden rice might be commonly consumed

It would be easy to take the OP's statements of what is shameful about EA and turn them into arguments for what to be proud of about EA. An op-ed response to the original op-ed might be helpful. I would avoid all the EA jargon and stick with plain English, stark facts, and obvious pride in EA's accomplishments.

whether EA advocated for pandemic preparedness before 2020

The biggest funding organization for EA-prioritized causes is Good Ventures (GV). Most of their philanthropic giving is on the recommendation of Open Philanthropy (OP). OP is one of, if not the, most significant organizations of its kind in EA. "Biosecurity and pandemic preparedness" has been its main focus area for long-termism/x-risk reduction, after risks from advanced AI, since 2015. GV has, on the recommendations of OP, given almost $130 million USD to date to efforts focused on biosecurity and pandemic preparedness.

GV/OP donated large amounts to pandemic preparedness before 2020, but the rest of the EA community did not. When I looked at LTFF grants in December 2020, I found 5 grants totaling $114k for pandemics (by comparison, AI had received ~19x more money). If you download GWWC's list of reported donations (which includes post-2020 data), the only dedicated biosecurity organizations in the top 100 are Johns Hopkins Center for Health Security (#58, 17 donors giving $257k) and Telis Bioscience (1 donor giving $250k, and FWIW I'm pretty sure this isn't a nonprofit). That's $507k combined, less than half of a percent of the total given to the top 100 organizations.

My impression is that the EA community talked a lot about pandemics prior to Covid, and made them a relatively high priority in career advice. But aside from GV/OP, I haven't seen evidence of sizeable donations. So my general sense is that (non-GV/OP) EAs are too quick to take credit for being ahead of the curve in this area, at least with respect to money (which I think is a good indicator of actual priorities).

You make some extremely good points. I'm not sure off the top of my head when I'll have time to respond with the care your comment deserves.

$130 million? Nice. I don't know how that compares to the totals spent by other NGOs, the US government, or an organization like the WHO, but the number shows EA's commitment to pandemic preparedness over time, well before SARS-CoV-2 became a concern.

Frankly, the author of the Journal piece (who I called the OP, sorry if that was confusing) did do a bad job, because:

  • he attacked an organization that funds charitable giving. To do it well you gotta claim there's loads of corruption or that the causes are something unpopular or obscure (like subsidizing tofu or saving endangered wild tree mammals in Borneo or whatever). The most he managed to insinuate is that you EA folks like to encourage public sector efforts and that your cause choices are redundant.
  • he downplayed efforts to help pandemic preparedness. Correct me if I'm wrong, but that's a popular issue right now, isn't it? You just need to mention that said EA folks had the foresight to begin work on pandemic preparedness in 2015.
  • he made an obvious attempt to align himself with folks that hate taxes. Unfortunately for him, those same folks like charitable giving, and the majority of EA involves philanthropy, not special interest work.
  • he went for unpopular ("liberal") associations, and you folks are a bit weird, but that's because you're super-smart. His focus on the liberal, leftie associations also doesn't really matter. It should be easy to ignore in your response, or tackle directly.

The only way to screw up a response to the guy would be to miss how he's put himself in a corner and then write as weird a response as possible. For example,  "with medium epistemic confidence we predict, with a 5% confidence interval based on scientific studies (see the footnotes), that the majority of our s- and x-risk causes conform to preference utilitarianism, meaning that both our longtermist and short-term outcomes are consistent with our aforementioned values of.." Don't do that.

But other than that, whether you all decide to hire a marketing firm or not, or put a formal plan together or not, this isn't a big deal. I think a great way to go is just a heated back and forth in the Journal op-ed section. I've read enough arguments to see that you can win this one easily.

Good luck with it all.

It's possible that it would be better to do this by hiring a PR agency with a pre-existing team (which has fewer start-up costs), but people who work in PR say that, over time, you just end up paying exorbitant fees if you take this approach.

Organizations in the EA ecosystem often have trouble hiring qualified candidates from outside EA who also understand the specialized technical knowledge needed for most work in EA. A single, permanent PR team focused exclusively on EA would be better able to learn EA over time and become the most competent kind of candidate.

This is consistent with the other advice I've been given. Apparently you need some people to really 'specialise' in your area, so that they can do high fidelity messaging, be the go-to place for comment etc.; so this seems like a good argument for trying to find a small group of EA-aligned (or 'teachable') professionals and then buy their time.

I've been advised by people in PR that the most cost-effective way to do this would be to hire a team of 2-3 full-time people from the PR sector and pay them at market rates (so I guess ~$500k/year).

The quoted $500k/year is for all 2-3 of said full-time people, not $500k per person, right?

Yes, although this was a total guess, assuming $200k/person for two senior people and then $100k for a more junior person. I have no idea what good PR professionals earn.

There is also a practical issue of trying to have multiple people covering different markets/languages, but I guess you use consultants strategically to overcome that.

"It is only when you don’t care about your reputation that you tend to have a good one."
-Nassim Taleb 

I downvoted this post – I think it's unhelpful to write a polemic complaining that "X isn't being done" without taking basic steps to find out what's already being done, or first writing a post asking about what's being done.

For instance:

  • Next week a major PR campaign to promote Will's book will begin.
  • Open Phil, CEA, 80k and others are already advised by a professional PR agency.
  • There are many in-progress efforts to get EA more into the media (e.g. it's a key focus for Longview).
  • There's a communications strategy being drafted.
  • Etc.

Thanks Ben. A few things:

  • I read the posts I could find on this topic on the forum, none of which mention a PR agency or hiring a PR team for the movement
  • I've talked about this with a lot of other EA professionals and no one has mentioned that this idea is coming, although all of them thought it was a good idea to some degree
  • I posted it as an idea for FTX and it wasn't taken up, without feedback or any suggestion that it was already happening
  • I visited the many examples of mainstream press criticism of EA cited in other posts and saw no response or comment 'from EA'

But, most importantly, the things you list here don't address the suggestion of this post. Individual orgs being advised by PR professionals, or Longview aiming for more press coverage, has at best partial overlap with the effect of a dedicated PR team for EA.

This is also self-evidently not having the effect of countering op-eds like this one (although that might be low priority work).

So, when OFTW, at least one GiveWell charity, and presumably others are forwarded this op-ed by potential or actual major donors, and there are crickets 'from EA' in response, I think it's pretty reasonable to say 'this seems like a good idea - can we make it happen?'

I think it's valuable to point out a problem. The fact is that the majority of media articles about EA are negative (and often inaccurate), and this has been the case for years. Inasmuch as this is a problem, all existing efforts to solve it have failed! Listing upcoming efforts seems like more of a nice addition than a mandatory component.

I interpreted this post as also complaining about there not being any sort of consensus among EAs as a community for how to deal with negative press. Most of what you listed would not qualify as a community-wide strategy. Sure, Open Phil, CEA, 80k, etc. having some PR guidance seems awesome, but what about everyone else? I'm guessing there are many orgs, individual people who have a lot of twitter followers or whatever, etc. that don't know how to respond to negative press.

  • There's a communications strategy being drafted.

seems to be the only thing you listed that actually addresses this. Which is great! But it seems pretty unrealistic to expect OP to have known this was happening. I'm also guessing this wouldn't have been answered in an EA Forum question; questions don't seem to gain that much traction on the EA Forum, and the vast majority of users probably had no clue a communications strategy was being drafted.
