Another article just dropped criticising Effective Altruism, this time in the Wall Street Journal. I'm not linking to it here because it's garbage, but if you really want to look it up, the headline is 'Effective Altruism is neither'. I'd encourage you not to, so as not to help its readership figures. [EDIT: a mirror of the article is available in the comments. But it'll put your blood pressure up.]
Other people have written very good forum posts about why we should care about perceptions of EA, collecting other examples of damaging commentary and suggesting some things we could do to help improve the community's image. However, I haven't seen anyone suggest that we should create an EA PR agency, or hire a team at a PR firm, to influence perception of EA. I think this is a very good idea.
It seems at the moment like EA is leaving a vacuum, which is being filled by criticism. This is happening in multiple languages. Much of it could easily be countered but we're not in the game.
There are all sorts of reasons not to worry too much about this particular opinion piece. Its criticisms are transparently bad, I suspect even to the audience it's written for - suggesting that pandemic preparedness is 'shutting the door after the horse has bolted' is self-evidently stupid. I doubt the readers of the WSJ opinion page are a high-priority, high-potential audience for EA. Even if it were devastating criticism aimed at a key audience, it might have limited reach, and we'd only amplify it by responding.
However, the point is that we should have some experts deciding this, rather than the current situation where no one seems to be monitoring this or trying to respond on our behalf.
It seems to me that some dedicated PR professionals could fairly quickly move to a) place more positive pieces about EA in the mainstream media; b) give more exposure to high-fidelity, convincing messengers from the community (e.g. Will MacAskill); c) become the go-to place for comment on damaging pieces (which currently never seem to involve a response from anyone in the community); and even d) manage to head off some of the most illogical, most bad-faith criticisms before they're published.
I've been advised by people in PR that the most cost-effective way to do this would be to hire a team of 2-3 full-time people from the PR sector and pay them at market rates (so I'd guess ~$500k/year). It might be better to do this by hiring a PR agency with a pre-existing team (which has fewer start-up costs), but people who work in PR say that, over time, you just end up paying exorbitant fees if you take this approach. I'd be happy with either, but instinctively lean towards the first.
In some ways, I think EA has already missed several golden PR opportunities, not least the release of several high-profile books (where there has been some decent PR, but I feel there probably could have been more), and the recent pandemic, which validated much of what the community has been saying for a long time. It would be good to avoid missing future opportunities; it would also be satisfying to see some counter-programming to take on these sporadic poor-quality/bad-faith critiques.
Call to action: if you agree, please comment or upvote; but, more importantly, send this on to people who might be able to fund this or otherwise make it happen. If you want to discuss the idea or think you can help, please DM me.
Here are some comments on the article that I sent to my family.
Not sure what he's talking about. I think the main point of Famine, Affluence, and Morality is that if you can help someone without a significant cost to yourself, you should.
Earning to give is only a small part of EA, and I don't think it's typically a post hoc rationalization. And EAs understand very well that working directly on problems can contribute to society - see the first WSJ article I sent.
It's plausible that the best way to reduce vitamin A deficiency is to invest in multiple strategies at once. But if he gave a thorough argument that donating to "golden rice" infrastructure fights vitamin A deficiency more effectively per dollar than vitamin A supplementation does, then I wouldn't be surprised to see GiveWell change its recommendations.
The author's comment seems quite silly to me.
I don't see anything wrong with SBF promoting a tax on extremely wealthy people to prevent pandemics (unless the resulting pandemic prevention efforts are less valuable than what the wealthy people would do with their money otherwise). In general, I'm sure some taxes are totally worth promoting.
Pandemic prevention is not a "quirky" concern!
Yes, EAs don't agree on everything, nor do I think they should. There's an emphasis within EA on updating your beliefs in response to new evidence, such as reasonable arguments from other people.
So the argument is that when deciding where to donate your money, you should use the same tactics that earned you that money in the first place? It's unclear how "cost-effectiveness" is the opposite of "linearity." Maybe he's advocating for donating to interventions that are like unicorn startups - interventions that could be hugely beneficial if they succeed, but probably won't do much. If so, this is pretty much exactly what Open Philanthropy is already doing ("hits-based giving").
It's entirely possible to believe in EA principles and support capitalism. But high economic productivity can come with damaging externalities, such as increased risk of global catastrophes from new technologies.
That seems totally incorrect. GiveWell estimates that donations to its recommended charities have averted over 100,000 deaths.
This is one of the few points in the article that I like. EA (which EA headquarters likes to describe as "a project") resembles a cult in some ways: people worry about future catastrophes, care about "doing good," think about weird ideas, and dream about growing the movement.