If you're interested in critiques of EA, you may find my just-released discussion with Michael Nielsen and Ajeya Cotra of interest!

Michael has been one of the most thoughtful EA critics, and Ajeya is an extremely sharp thinker.

Here are the topics we discuss:

- What is effective altruism?
- Which parts of the effective altruism movement are good and not so good?
- Which groups of people outside of the EA movement are doing lots of good in the world?
- What are the psychological effects of thinking constantly about the trade-offs of spending resources on ourselves versus on others?
- To what degree is the EA movement centralized intellectually, financially, etc.?
- Does the EA movement's tendency to quantify everything, to make everything legible to itself, cause it to miss important features of the world?
- To what extent do EA people rationalize spending resources on inefficient or selfish projects by reframing them in terms of EA values?
- Is a feeling of tension about how to allocate our resources actually a good thing?

And here's a quote from the episode (where Michael complimented Ajeya and other EAs on how they respond to criticism): 

Something I have been struck with [about] Ajeya’s responses, and also the response of many other EAs who have read what I wrote, has been the extent to which they're willing to take criticism seriously and often try and reflect it back or even do the steelmanning thing…

If you'd like to listen to the conversation or read the transcript, you can go to: https://clearerthinkingpodcast.com/episode/118

Comments

I liked both this episode and the one on social justice last winter and would love to hear more semi-adversarial ones of this sort.

Thanks! :) I’m glad you enjoyed the episodes, and yes, we’d like to do more episodes where we bring on people who disagree (and find out why they disagree).

Michael on expected value calculations and legibility:

I think if you'd been an effective altruist in the 1660s trying to decide whether or not to fund Isaac Newton — the theologian, astrologer, and alchemist — he had no legible project at all. That would have looked just very strange. You would have had no way of making any sense of what he was doing in terms of an EV point of view. He was laying the foundations for a worldview that would enable the Industrial Revolution and a complete transformation in what humanity was about. That's true for a lot of the things that have been the most impactful.

A quote on centralization in EA (in which Ajeya was steelmanning an argument Michael had made): 

My understanding of your position is that it's difficult to figure out what the most important causes are — to simplify a little bit and say there's one most important cause as opposed to the most important or optimal portfolio. EA has a pretty small set of intellectual leaders that are trying to think about this and then disseminating their conclusions. And because this is a really difficult project, they're going to get it wrong a lot. But they're also going to influence a pretty large number of people to work on whatever they say is the current best thing that they believe. Even if that's completely in good faith, you're like, “Well, people should just have a policy of listening less to such intellectual leaders and doing their own thing more.” And that could have led to a faster convergence on the optimal cause or optimal portfolio or whatever it is.
