I was recently a guest on a podcast and did an episode giving an introduction to EA. The podcast was Not Overthinking by Ali Abdaal and Taimur Abdaal; the episode is called 'How can we optimise for a meaningful life?' and can be found on any sensible podcast player. The audience is mostly non-EA, and the podcast tends to focus on productivity, goals and life-optimisation (it feels fairly rationalist-adjacent to me).

The episode was aimed at people unfamiliar with EA, especially younger people interested in productivity and life-optimisation. I tried to focus on the optimisation angle of EA ideas, how EA engages with motivation (especially if you care about altruism on an intellectual level but not an emotional one), and how EA ideas can be a central part of your career plans.

Aspects that I think might be interesting to people already familiar with EA:

  • A different take on introducing EA ideas (with framing mostly stolen from this recent 80K episode)
    • A collection of my favourite intuitive examples for key ideas. I think good examples are a majorly underrated part of good outreach
    • Using COVID and biosecurity examples to introduce longtermist ideas
    • Connecting EA ideas to self-improvement/productivity ideas (80/20-ing, time management, motivation hacking, etc)
  • Thoughts on how to make doing good a big life priority, while keeping motivation sustainable
  • (Possibly) Something to send to friends who fit into the intended audience

Any feedback on things the episode did well or badly is welcome!

Comments
Thanks for sharing, and for having done this! I like the optimisation framing. I can imagine that many 'natural' EAs will find it very intuitive when EA is introduced like that.

You had an interesting discussion on the tension between valuing every person equally and caring more about your local community. I think about it as asking "What's the optimal way for a person like me to think and act in those situations?". Because everyone has more influence on communities they are part of, I'd generally endorse the part of our cognition that thinks we have special responsibilities here. The tension for me comes from the insight that people from other communities are so much more in need than people in wealthier communities like those where most EAs live, so it's optimal to rebalance our resources significantly from what's considered normal, or from what would be optimal if everyone followed the Optimal Way.

(By the way, I'm kind of confused why this post has only 2 karma.)
