London folks - I'm going to be running the EA Taskmaster game again at the AIM office on the afternoon of Sunday 8th September.
It's a fun, slightly geeky, way to spend a Sunday afternoon. Check out last year's list of tasks for a flavour of what's in store 👀
Sign up here
(Wee bit late in properly advertising so please do spread the word!)
https://projects.propublica.org/nonprofits/ is a great resource on American nonprofits:
New Incentives in particular seems poised to spend much more after receiving large ~GiveWell cash grants
I'm concerned about the new terms of service for Giving What We Can, which will go into effect after August 31, 2024:
This is a significant departure from Effective Ventures' TOS (GWWC is spinning out of EV), under which users grant EV an unlimited but non-exclusive license to use any feedback or suggestions they send, while retaining the right to do anything with those ideas themselves. I've previously talked to GWWC staff about my ideas to help people give effectively, like a donation decision worksheet I made. If this provision goes into effect, it would deter me from sharing suggestions with GWWC in the future, because I would risk losing the right to disseminate or continue developing those ideas or materials myself.
For years I've tried to succinctly formulate my issue with Effective Altruism; this is the best I've done yet:
Ethical Humility: What Cellular Automata have to say about ruthless introspection, Effective Altruism, and annoyed girlfriends.
I write this to my younger self, who was involved in Effective Altruism, and to any young people in it now.
08/02/2024 12:01am
Infinite complexity is always right around the corner. While this has been said in many ways and discovered many times, no formulation is as concise or as simple as Stephen Wolfram’s Cellular Automaton Rule 30. Rule 30 beautifully illustrates how infinite complexity quickly emerges from simplicity, and is in some ways deeper than even physics.
Rule 30 is a list of eight rules, one for each possible triple of black-or-white boxes, specifying the single box produced beneath that triple. Applied row after row, it yields a seemingly random pattern.
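To make this concrete, here is a minimal Python sketch (my own illustration, not from the original post) of how Rule 30 generates its pattern. The only assumption is the standard Rule 30 update: each new box is `left XOR (center OR right)` of the three boxes above it, with 1 for black and 0 for white.

```python
def rule30_step(row):
    """Compute the next row from the current one (edges padded with white)."""
    padded = [0] + row + [0]
    # Rule 30: new cell = left XOR (center OR right)
    return [padded[i - 1] ^ (padded[i] | padded[i + 1])
            for i in range(1, len(padded) - 1)]

def run_rule30(steps=16):
    # Start from a single black cell in the middle of a white row.
    width = 2 * steps + 1
    row = [0] * width
    row[steps] = 1
    for _ in range(steps):
        print("".join("█" if cell else " " for cell in row))
        row = rule30_step(row)

if __name__ == "__main__":
    run_rule30()
```

Eight trivial rules applied mechanically, and the printed triangle already defies simple description (the center column is famously irregular enough to have been used as a random number source), which is the whole point.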
My version of this is what you could call the “annoyed girlfriend effect”: a project always takes longer than planned because hidden subproblems keep surfacing, and “just one more minute” turns into the whole evening. Put more directly, even seemingly simple problems can have huge hidden complexities.
Now when I first looked at these white and black blocks, I was underwhelmed. After all the hype I’d heard about Cellular Automata, this was it? But after thinking about it for a while, carrying this idea with me, and taking it to its natural conclusions, I realized there is quite a lot that these unassuming boxes can say.
The Woke movement, for example, is based on the idea that all evil stems from the oppression of minorities. It follows that we must eradicate oppression at all costs, resulting in dangerous, restrictive policies like the Canadian Bill C-16. The fundamental issue with the movement is its ideological homogeneity: its unwillingness to recognize the complexity of the issue of oppression or to account for its own ignorance. The roads to tyranny are numerous, foggy, and one-way. If you consider all countries across space and
A while ago -- 2017, maybe? -- I remember attending EA Global in San Francisco, where Will MacAskill gave an address (either the keynote or the closing talk) on the theme "Keep EA Weird". Do people still support this ideal? I notice that Good Ventures recently stopped funding some "weirder" cause areas, for instance.
It seems at least possible to me that as EA has gotten bigger, some of the "weird" stuff has been pushed to the margins, and that this is correct; but I'm not sure I've seen a detailed discussion of this, at least when it comes to cause areas rather than debates about polyamory and the like.
I run some online book clubs, some of which are explicitly EA and some of which are EA-adjacent: one on China as it relates to EA, one on professional development for EAs, and one on animal rights/welfare/advocacy. I don't like self-promoting, but I figure I should post this at least once on the EA Forum so that people can find it if they search for "book club" or "reading group." Details, including links for joining each of the book clubs, are in this Google Doc.
I want to emphasize that this isn't funded through an organization, I'm not trying to collect emails for a newsletter, and I'm not selling an online course or pushing people to buy a product. This is literally just online book clubs: we vote on books and have video chats to talk about them.
Here are some upcoming discussions, with links for the events:
* August 14, The Culture Map: Breaking Through the Invisible Boundaries of Global Business. https://calendar.app.google/WY6LocYTX4WfCjAw5
* August 18, China: The Bubble That Never Pops. https://calendar.app.google/oUkTYWLg29mAK1xH9
* August 24, Dialogues on Ethical Vegetarianism. https://lu.ma/xuascqt5
* September 21, How Asia Works: Success and Failure in the World's Most Dynamic Region. https://calendar.app.google/TWWa2yLeKNEupoiaA
* September 22, The Scout Mindset: Why Some People See Things Clearly and Others Don't. https://calendar.app.google/SEtCiaoQw5ZmArhS6
* September 28, The Emotional Lives of Animals: A Leading Scientist Explores Animal Joy, Sorrow, and Empathy - and Why They Matter. https://lu.ma/ng492gwf
If there is interest, I'd be open to organizing/coordinating some kind of "core EA books" reading group, with books like What We Owe the Future, Scout Mindset, Doing Good Better, Animal Liberation, Poor Economics, etc.
I've been reviewing some old Forum posts for an upcoming post I'm writing, and incidentally came across this from Howie Lempel, on noticing in what spirit you're engaging with someone's ideas:
I felt pretty called out :P
To be fair, I think the latter is sometimes a reasonable persuasive tactic, and it's fine to put yourself in a teaching role rather than a learning role if that's your endorsed intention and the other party is on board. But the value of this quote to me is that it successfully highlights how easily we can tell ourselves we're being intellectually curious, when we're actually doing something else.
I'm extremely excited that EAGxIndia 2024 is confirmed for October 19–20 in Bengaluru! The team will post a full forum post with more details in the coming days, but I wanted to get a quick note out immediately so people can begin considering travel plans. You can sign up to be notified when admissions open, or to express interest in presenting, via the forms linked on the event page:
https://www.effectivealtruism.org/ea-global/events/eagxindia-2024
Hope to see many of you there!!