Hey everyone, I’ve been going through the EA Introductory Program, and I have to admit some of these ideas make sense, but others leave me with more questions than answers. I’m trying to wrap my head around certain core EA principles, and the more I think about them, the more I wonder: Am I misunderstanding, or are there blind spots in EA’s approach?
I’d really love to hear what others think. Maybe you can help me clarify some of my doubts. Or maybe you share the same reservations? Let’s talk.
Cause Prioritization: Does It Ignore Political and Social Reality?
EA focuses on doing the most good per dollar, which makes sense in theory. But does it hold up in real-world contexts, especially in countries like Uganda?
Take malaria prevention. It's a top EA cause because it's highly cost-effective: roughly $5,000 spent on bed nets can save a life (GiveWell, 2023). But what happens when government corruption or instability disrupts these programs? The Global Fund scandal in Uganda saw $1.6 million in malaria aid mismanaged (Global Fund Audit Report, 2016). If money isn't reaching the people it's meant to help, is it really the best use of resources?
And what about leadership changes? Policies shift unpredictably here. A national animal welfare initiative I supported lost momentum when political priorities changed. How does EA factor in these uncertainties when prioritizing causes? It feels like EA assumes a stable world where money always achieves the intended impact. But what if that’s not the world we live in?
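To make my worry concrete, here's a rough back-of-the-envelope sketch of how funding leakage and political disruption might change the effective cost per life saved. All the numbers are made up for illustration (the leakage rate is loosely inspired by the audit story above, not taken from it), and this is just my arithmetic, not any official GiveWell model:

```python
# Toy risk-adjusted cost-effectiveness sketch (all numbers illustrative).
# Baseline: ~$5,000 per life saved via bed nets, under ideal conditions.

BASELINE_COST_PER_LIFE = 5_000  # USD per life saved, ideal conditions

def effective_cost_per_life(leakage_rate: float, p_disruption: float,
                            disruption_loss: float) -> float:
    """Adjust the baseline cost for money that never reaches programs
    (leakage) and for the chance a program is derailed mid-stream.

    leakage_rate:     fraction of funds lost to mismanagement (0-1)
    p_disruption:     probability the program is disrupted (0-1)
    disruption_loss:  fraction of impact lost if disruption occurs (0-1)
    """
    funds_delivered = 1.0 - leakage_rate
    expected_impact = 1.0 - p_disruption * disruption_loss
    return BASELINE_COST_PER_LIFE / (funds_delivered * expected_impact)

# Ideal world: every dollar arrives, nothing gets derailed.
print(effective_cost_per_life(0.0, 0.0, 0.0))         # 5000.0

# Pessimistic sketch: 20% leakage, 30% chance a policy shift
# wipes out half of the program's impact.
print(round(effective_cost_per_life(0.2, 0.3, 0.5)))  # ~7353
```

Even under these crude assumptions, bed nets may still look very cost-effective. My question is whether EA's headline numbers build this kind of adjustment in, or quietly assume the ideal-world row.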
Longtermism: A Luxury When the Present Is in Crisis?
I get why longtermists argue that future people matter. But should we really prioritize them over people suffering today?
Longtermism tells us that existential risks like AI could wipe out trillions of future lives. But in Uganda, we're losing lives now: 1,500+ people die from rabies annually (WHO, 2021), and 41% of children suffer from stunting due to malnutrition (UNICEF, 2022). These are preventable deaths.
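Here's the expected-value arithmetic as I understand the longtermist case, again with entirely made-up numbers (both `p_avert_xrisk` and `future_lives` are my placeholder guesses, not figures from any source), so you can see why it sits uneasily with me:

```python
# Toy expected-value comparison (all numbers illustrative, not from any source).

# Longtermist bet: a tiny probability of averting extinction,
# multiplied by an enormous number of assumed future lives.
p_avert_xrisk = 1e-9   # guessed chance a marginal donation helps avert extinction
future_lives = 1e15    # assumed future people if we don't go extinct
ev_longtermist = p_avert_xrisk * future_lives    # 1,000,000 lives in expectation

# Near-term bet: near-certain, countable lives saved today.
donation = 1_000_000   # USD
cost_per_life = 5_000  # GiveWell's rough bed-net figure
ev_neartermist = donation / cost_per_life        # 200 lives, with high confidence

print(ev_longtermist, ev_neartermist)  # 1000000.0 vs 200.0
```

On paper the longtermist column dominates, but it rests entirely on two unverifiable guesses, while the 200 is backed by trials and delivery data. That asymmetry is the heart of my discomfort.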
@Aaron Bergman's Pigeon Hour gets close to this https://www.aaronbergman.net/podcast
+1, love Pigeon Hour
Omg 😊😊😊😊
I'd be happy to listen to conversations with interesting and articulate people who are low-key. I suspect that a major challenge will be finding these people. My general/vague understanding is that most (not all) people who are on podcasts (or who have any type of public image) tend to do a certain amount of self-promotion, and I predict that tendency to self-promote is negatively correlated with being low-key.
I've met a handful of people doing good work who don't appear to spend much effort on what I'll label "image," but if you are searching for interesting and articulate people for a podcast, it will be hard to find them, exactly because they don't promote themselves.
Finding them should be easy, no? Just check the employees of interesting orgs on LinkedIn.
Maybe convincing them will be harder.
Second this. I think it could be really fun to listen to if it focused on the hardships of doing ops work, development, etc.: grunt work that is super important but not glamorous. It could be a bit like listening to ultramarathon runners talk about their run, how hard it was, the major pains they encountered, and how they overcame them. Kind of like celebrating schlep in EA. This way, more people might also get excited and feel rewarded for doing hard and boring stuff, which some senior EAs have seemed to indicate we need more of, along with less "galaxy brain fun and wild ideas".