Hey there, I'm Austin, currently running https://manifund.org. Always happy to meet people; reach out at akrolsmir@gmail.com!
There are maybe 30 to 60 people in the world doing AI safety grantmaking, collectively directing hundreds of millions of dollars a year. Soon, there will be >$1B directed per year, and potentially multiple billions.
I like this framing for the BOTECs (back-of-the-envelope calculations) it encourages!
Currently it seems like each grantmaker is (on average) responsible for ~$10m/y. One question I think about sometimes: how will the # of grantmakers scale as more $ go towards AI safety funding? If funding is eg 3x'ing year-over-year, it's unclear whether we're currently training up grantmakers at that rate.
Another question might be: what is a good ratio of # of grantmakers to # of direct workers? I'd ballpark ~1000 fulltime AIS direct workers; does a 20:1 ratio seem high, low, or just right?
I'd be curious to look at comparably scaled funding ecosystems as a reference class; I'm primarily thinking of VCs & angels, but perhaps others, eg academic funding, are also appropriate.
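The ratios above can be sketched as a quick BOTEC (the midpoint and dollar totals here are my own assumed inputs, not measured data):

```python
# Rough BOTEC for the grantmaker ratios discussed above.
# All inputs are assumptions: midpoint of 30-60 grantmakers,
# "hundreds of millions" taken as ~$450m/y, ~1000 direct workers.
grantmakers = 45
dollars_per_year = 450e6
direct_workers = 1000

dollars_per_grantmaker = dollars_per_year / grantmakers   # ~$10m/y each
workers_per_grantmaker = direct_workers / grantmakers     # ~20:1 ratio

print(f"~${dollars_per_grantmaker / 1e6:.0f}m directed per grantmaker per year")
print(f"~{workers_per_grantmaker:.0f} direct workers per grantmaker")
```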
It's a separate event run by CEA, which, in contrast to EAG, is much smaller and just for leaders in the field of xrisk. (I haven't been, but my wife attended the 2026 edition)
See also https://forum.effectivealtruism.org/posts/WLZabqQGCd2joZpxR/summit-on-existential-security-2023
I'm hiring for a variety of roles, which are mostly operational/community-shaped:
Hm, as a "helpful" react-er, I was trying to communicate both "thanks for engaging" and "I have seen this". I recognize that it's hard for org leaders to weigh in on things in detail, so I simultaneously appreciate Zach saying anything at all, and wish he or someone else could elaborate (as, I suppose, Oli and I did in the replies at the time, below).
Mostly I wasn't thinking that much about norms for reacts, idk
Thanks for the post! It seems like CEA and EA Funds are the only entities left housed under EV (per the EV website); if that's the case, why bother spinning out at all?
To be clear, "10 new OpenPhils" is trying to convey a gestalt or a vibe: how I expect the feeling of working within EA causes to change, rather than a rigorous point estimate
Though, I'd be willing to bet at even odds, something like "yearly EA giving exceeds $10B by end of 2031", which is about 10x the largest year per https://forum.effectivealtruism.org/posts/NWHb4nsnXRxDDFGLy/historical-ea-funding-data-2025-update.
Some factors that could raise giving estimates:
Also, the Anthropic situation seems like it'll be different from the Dustin one in that the number of individual donors ("principals") goes up a lot - which I'm guessing leads to more grants at smaller sizes, rather than OpenPhil's (relatively) few, giant grants
Appreciate the shoutout! Some thoughts:
Very much agreed, though I'm guilty for not having done this myself; hope to fix this soon!
Two other donation writeups I really liked: