This December is the last month in which unlimited redemption of Manifold Markets currency for donations is assured: https://manifoldmarkets.notion.site/The-New-Deal-for-Manifold-s-Charity-Program-1527421b89224370a30dc1c7820c23ec
I highly recommend redeeming for donations this month, since there is orders of magnitude more currency outstanding than can be donated in future months.
Effective giving quick take for giving season
This is quite half-baked, because my social circle doesn't contain very many earning-to-give (E2G) folks, but I have a feeling that when EA suddenly came into a lot more funding and the word on the street was that we were “talent constrained, not funding constrained”, some people earning to give ended up pretty jerked around, or at least feeling that way. They may have picked jobs and life plans based on the earning-to-give model, where it would be years before the plans came to fruition, and in the middle, they lost status and attention from their community. There might have been an additional dynamic where the people who took the advice most seriously ended up deeply embedded in other professional communities, so they heard about the switch later, or found it harder to reconnect with the community and its new priorities.
I really don’t have an overall view on how bad all of this was, or if anyone should have done anything differently, but I do have a sense that EA has a bit of a feature of jerking people around like this, where priorities and advice change faster than the advice can be fully acted on. The world and the right priorities really do change, though; I’m not sure what should be done except to be clearer about all this, but I suspect it’s hard to properly convey “this seems like the absolute best thing in the world to do, also next year my view could be that it’s basically useless” even if you use those exact words. And maybe people have done this, or maybe it’s worth trying harder. Another approach would be something like insurance.
A frame I’ve been more interested in lately (definitely not original to me) is that earning to give is a kind of resilience / robustness-add for EA, where more donors just means better ability to withstand crazy events, even if in most worlds the small donors aren’t adding much in the way of impact. Not clear that that nets out, but “good in case of tail risk” seems like an important aspect.
There is still plenty of time to vote in the Donation Election. The group donation pot currently stands at around $30,000. You can nudge that towards the projects you think are most worthwhile (plus, the voting system is fun and might teach you something about your preferences).
Also, you should donate to the Donation Election fund if:
a) You want to encourage thinking about effective donations on the Forum.
b) You want to commit to donating in line with the Forum's preferences.
c) You'd like me to draw you one of these bad animals (or earn one of our other rewards):
NB: I can also draw these animals holding objects of your choice. Or wearing clothes. Anything is possible.
The Effective Ventures Foundation UK’s full accounts for fiscal year 2022 have been released via the UK Companies House filings (the August 30, 2023 entry; it won't let me link directly to the PDF).
* Important to note that as of June 2022, “EV UK is no longer the sole member of EV US and now operate as separate organizations but coordinate per an affiliation agreement” (p11).
* It’s noted that Open Philanthropy was, for the 2021/2022 fiscal year, the primary funder for the organization (p8).
* EVF (UK&US) had consolidated income of just over £138 million (as of June 2022). That’s a ~£95 million increase from 2021.
* Consolidated expenses for 2022 were ~£79 million, an increase of £56 million from 2021 (still p8).
* By the end of the fiscal year, consolidated net funds were just over £87 million, of which £45.7 million was unrestricted.
* Page 10 outlines EVF’s approach to risk management and mentions the FTX collapse.
* A lot of boilerplate in this document, so you may want to skip ahead to page 26 for more specific breakdowns.
* EVF made grants totaling ~£50 million (to institutions and 826 individuals), an increase of almost £42 million in one year (p27).
* A list of grant breakdowns (p28); a lot of recognizable organizations are listed, from AMF to BERI and ACE.
* Also a handful of orgs I don't recognize, and vague groupings like “other EA organizations” totalling almost £3 million.
* Expense details (p30): the main programs are (1) Core Activities, (2) 80,000 Hours, (3) Forethought, and (4) Grant-making.
* Expenses totaled £79 million for 2022 (a £65 million increase from 2021), which seems like a huge jump for just one year.
* Further expense details are on pp. 31–33 and tentatively show a £23.3 million jump between 2021 and 2022 [but the table line items are NOT the same across 2021/2022, so it’s hard to tell; if anyone can break this down better, please do in the comments].
* We may now have a more accurate number of £1.6 million spent on marketing for What We Owe The Future (which i
I was finding it hard to keep track of all the different organizations posting about their marginal funding plans recently, so I made a simple spreadsheet:
Feel free to add any other EA orgs or fix errors or re-arrange everything or whatever.
I feel a bit confused about how much I should be donating.
1. On the one hand, there’s just a straightforward case that donating could help many sentient beings to a greater degree than it helps me. On the other hand, donating 10% for me feels like it’s coming from a place of fitting in with the EA consensus, gaining a certain kind of status, and feeling good, rather than believing it’s the best thing for me to do.
2. I’m also confused about whether I’m already donating a substantial fraction of my income.
* I’m pretty confident that I’m taking at least a 10% pay cut in my current role. If nothing else, my salary right now is not adjusted for inflation, which was ~8% last year, so it feels like I’m underpaid by at least that amount (though it’s possible they were overpaying me before). Many of my friends earn more than twice as much as I do, and I think if I negotiated hard for a 100% salary increase, the board would likely comply.
* So how much of my lost salary should I consider to be a donation? I think numbers between 0% and 100% are plausible. Even -50% isn’t insane to me, as my salary does funge with other people's donations to charities.
* One solution is that I should just negotiate for my salary from a non-altruistic perspective, and then decide how much I want to donate back to my organisation after that. This seems a bit inefficient though and I think we should be able to do better.
3. One reason I don’t donate ~50% of my salary is that I genuinely believe it’s more cost-effective for me to build runway than donate right now. I quite like the idea of discussing this with someone who strongly disagrees with me and I admire and see if they come round to my position. It feels a bit too easy to find reasons not to give, and I’m very aware of my own selfishness in many parts of my life.
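The forgone-salary question in point 2 can be made concrete with a toy calculation. This is a hypothetical sketch with made-up numbers (a 20% pay cut relative to a negotiable market salary, plus a standard 10% pledge), not anyone's actual figures; it just shows how much the "effective donation rate" swings depending on what fraction of the pay cut you count:

```python
# Toy model: effective donation rate when part of your salary is forgone.
# All figures are illustrative assumptions, not real numbers.

market_salary = 100_000   # hypothetical salary from negotiating non-altruistically
actual_salary = 80_000    # hypothetical current salary (a 20% pay cut here)
explicit_donation = 0.10 * actual_salary  # a standard 10% pledge

forgone = market_salary - actual_salary   # salary left on the table

# Count 0%, 50%, or 100% of the forgone salary as a donation.
for counted_fraction in (0.0, 0.5, 1.0):
    implied = explicit_donation + counted_fraction * forgone
    rate = implied / market_salary
    print(f"counting {counted_fraction:.0%} of forgone salary: "
          f"effective donation rate {rate:.1%}")
```

Under these assumptions the effective rate ranges from 8% (counting none of the pay cut) to 28% (counting all of it), which is why the "0% to 100% are plausible" range in point 2 matters so much in practice.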
Relevant to giving season: 80,000 Hours will soon be doing a public fundraising round! If you’d like to be notified when we launch, leave your email here. If you have any questions in the meantime, please contact email@example.com.
The Happier Lives Institute have helped many people (including me) open their eyes to subjective wellbeing (SWB), and perhaps even updated us toward its potential value. The recent heavy discussion (60+ comments) on their fundraising thread disheartened me. Although I agree with much of the criticism against them, the hammering they took felt rough at best, and perhaps even unfair. I'm not sure exactly why I felt this way, but here are a few ideas.
* (High certainty) HLI have openly published their research and ideas, posted almost everything on the forum and engaged deeply with criticism which is amazing - more than perhaps any other org I have seen. This may (uncertain) have hurt them more than it has helped them.
* (High certainty) When other orgs are criticised or asked questions, they often don't reply at all, or get surprisingly little criticism for what I and many EAs might consider poor epistemics and defensiveness in their posts (for charity's sake I'm not going to link to the handful I can think of). Why does HLI get such a hard time while others get a pass? Especially when HLI's funding is less than that of many orgs that have not been scrutinised as much.
* (Low certainty) The degree of scrutiny and analysis applied to development orgs like HLI seems to exceed that applied to AI orgs, funding orgs, and community-building orgs. This scrutiny has been intense: more than one amazing statistician has picked apart their analysis. This expert-level scrutiny is fantastic; I just wish it could be applied to other orgs as well. Very few EA orgs (at least among those that have posted on the forum) produce full papers with publishable-level deep statistical analysis, as HLI have at least attempted to do. Does there need to be a "scrutiny rebalancing" of sorts? I would rather other orgs got more scrutiny than that development orgs got less.
Other orgs might see threads like the HLI funding thread hammering and compare it with other threads where orgs are criticised and don't eng