Has anyone else noticed anti-LGBT and specifically anti-trans sentiment in the EA and rationalist communities? I encountered this recently and it was bad enough that I deactivated my LessWrong account and quit the Dank EA Memes group on Facebook.
Quick Book Review. As we enter a period of frequent holiday get-togethers, I strongly recommend reading Priya Parker's The Art of Gathering. In it, Parker whisks you through the ways you can make your gathering special. While hosting good gatherings for the utility they provide is likely not a top cause area, creating community around effective action seems indispensable. Plus, the book is available for five dollars on Amazon, so the potential ROI seems massive: https://www.amazon.com/Art-Gathering-How-Meet-Matters/dp/1594634920
Questioning the new "EA is funding constrained" narrative
I recently saw a presentation with a diagram showing how committed EA funding dropped by almost half with the collapse of FTX, based on these data compiled by 80k in 2022. Open Phil at the time had a $22.5 billion endowment and FTX's founders were collectively worth $16.5 billion.
I think that this narrative gives off the impression that EA causes (especially global health and development) are more funding-constrained than they really are. 80k's data excludes philanthropists that often make donations in an EA-aligned way, such as the Gates Foundation's global health program. As of 2022, Gates' endowment was worth $67.3 billion,[1] and global health and development accounted for more than half of its spending that year.[2] The Gates Foundation's global health program, at least, seems to make grants in a cost-effectiveness-driven way, so it arguably should count as EA funding.
1. ^ Gates Foundation on Wikipedia
2. ^ Gates Foundation's 2022 Annual Report
The Effective Ventures Foundation UK’s Full Accounts for Fiscal Year 2022 have been released via the UK Companies House filings (August 30, 2023 entry; it won't let me direct-link the PDF).
* Important to note that as of June 2022 “EV UK is no longer the sole member of EV US and now operate as separate organizations but coordinate per an affiliation agreement (p11).”
* It’s noted that Open Philanthropy was, for the 2021/2022 fiscal year, the primary funder for the organization (p8).
* EVF (UK&US) had consolidated income of just over £138 million (as of June 2022). That’s a ~£95 million increase from 2021.
* Consolidated expenses for 2022 were ~ £79 million - an increase of £56 million from 2021 (still p8).
* By end of fiscal year consolidated net funds were just over £87 million of which £45.7 million were unrestricted.
* (p10) outlines EVF’s approach to risk management and mentions FTX collapse.
* A lot of boilerplate in this document, so you may want to skip ahead to page 26 for more specific breakdowns
* EVF made grants totaling ~£50 million (to institutions and 826 individuals), an almost £42 million increase in one year (p27)
* A list of grant breakdowns (p28); a lot of recognizable organizations listed, from AMF to BERI and ACE
 * Also a handful of orgs I don't recognize, plus vague groupings like “other EA organizations” for almost £3 million
* Expenses details (p30) main programs are (1) Core Activities (2) 80,000 Hours (3) Forethought and (4) Grant-making
* Expenses totaled £79 million for 2022 (a £65 million increase from 2021) which seems like a huge jump for just one year
* further expense details are on (p31-33) and tentatively show a £23.3 million jump between 2021 and 2022 [but the table line items are NOT the same across 2021/2022 so it’s hard to tell - if anyone can break this down better please do in the comments]
* We may now have a more accurate number of £1.6 million spent on marketing for What We Owe The Future
Wytham Abbey soft-launched earlier this year with its own team, but has now formally been added to EV's list of projects and is accepting workshop applications: https://www.wythamabbey.org
Atlas Fellowship has announced it's shutting down its program - see the full letter on their site: https://www.atlasfellowship.org Reasons listed for the decision are: 1) the funding landscape has changed, 2) the programs were less impactful than expected, and 3) some staff think they'll have more impact pursuing careers in AI safety.
A quick way to re-instill motivation for those working on the welfare of animals of unknown sentience: it's useful to remember that the sentience of two possibly sentient beings, even of the same species, can be treated as independent. They don't have to be either both sentient or both non-sentient, so you're not taking a single 1-in-100 chance that your job is worthwhile. If each life you save has a 1-in-100 chance of being sentient, then the probability that at least one of n such lives was sentient is 1 − (0.99ⁿ), which rises quickly with n.
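The calculation above can be sketched in a few lines of Python (the 1-in-100 figure and the independence assumption are taken from the argument above, not from any empirical source):

```python
# Probability that at least one of n independently-sentient lives is sentient,
# assuming each has probability p of sentience (p = 0.01 in the example above).

def prob_at_least_one_sentient(n: int, p: float = 0.01) -> float:
    """Complement of the chance that all n lives are non-sentient."""
    return 1 - (1 - p) ** n

# The chance that your work mattered grows quickly with lives saved:
for n in [1, 10, 100, 500]:
    print(f"n = {n:>3}: {prob_at_least_one_sentient(n):.1%}")
```

At n = 100 the probability is already above 60%, which is the motivational point: the relevant chance is not 1-in-100 per career, but 1-in-100 per life saved.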
I'm thinking about organising a webinar series on space and existential risk, mostly because it's something I would really like to see. The series would cover a wide range of topics:
* Asteroid Impacts
* Building International Collaborations
* Monitoring Nuclear Weapons Testing
* Monitoring Climate Change Impacts
* Planetary Protection from Mars Sample Return
* Space Colonisation
* Cosmic Threats (supernovae, gamma-ray bursts, solar flares)
* The Overview Effect
* Astrobiology and Longtermism
This would be an online webinar series. Is this something people would be interested in?