I wanted to share this update from Good Ventures (Cari and Dustin’s philanthropy), which seems relevant to the EA community.
Tl;dr: “while we generally plan to continue increasing our grantmaking in our existing focus areas via our partner Open Philanthropy, we have decided to exit a handful of sub-causes (amounting to less than 5% of our annual grantmaking), and we are no longer planning to expand into new causes in the near term by default.”
A few follow-ups on this from an Open Phil perspective:
- I want to apologize to directly affected grantees (who've already been notified) for the negative surprise here, and for our part in not better anticipating it.
- While this represents a real update, we remain deeply aligned with Good Ventures (they’re expecting to continue to increase giving via OP over time), and grateful for how many of the diverse funding opportunities we’ve recommended that they’ve been willing to tackle.
- An example of a new potential focus area that OP staff had been interested in exploring that Good Ventures is not planning to fund is research on the potential moral patienthood of digital minds. If any readers are interested in funding opportunities in that space, please reach out.
- Good Ventures has told us they don’t plan to exit any overall focus areas in the near term. But this update is an important reminder that such a high degree of reliance on one funder (especially on the GCR side) represents a structural risk. I think it’s important to diversify funding in many of the fields Good Ventures currently funds, and that doing so could make the funding base more stable both directly (by diversifying funding sources) and indirectly (by lowering the time and energy costs to Good Ventures from being such a disproportionately large funder).
- Another implication of these changes is that going forward, OP will have a higher bar for recommending grants that could draw on limited Good Ventures bandwidth, and so our program staff will face more constraints in terms of what they’re able to fund. We always knew we weren’t funding every worthy thing out there, but that will be even more true going forward. Accordingly, we expect marginal opportunities for other funders to look stronger going forward.
- Historically, OP has been focused on finding enough outstanding giving opportunities to hit Good Ventures’ spending targets, with a long-term vision that once we had hit those targets, we’d expand our work to support other donors seeking to maximize their impact. We’d already gotten a lot closer to GV’s spending targets over the last couple of years, but this update has accelerated our timeline for investing more in partnerships and advising other philanthropists. If you’re interested, please consider applying or referring candidates to lead our new partnerships function. And if you happen to be a philanthropist looking for advice on how to invest >$1M/year in new cause areas, please get in touch.
First, I feel like you are conflating two issues here. You start and finish by talking about PR, but in the middle you argue for the importance of the issue itself. I think it's important to separate these two issues to avoid confusion, so I'll just discuss the PR angle.
I disagree and think there's a smallish but significant risk of PR badness here. From my experience talking to even my highly educated friends who aren't into EA, they find it very strange that money is invested in researching the welfare of future AI minds at all, and they often flat-out disagree that money should be spent on it. That indicates to me (weakly, from anecdata) that there is at least some PR risk here.
I also think there are pretty straightforward framings like "millions poured into the welfare of robot minds which don't even exist yet" which could certainly be bad for PR. If I were anti-EA, I could write a pretty good hit piece about rich people in Silicon Valley prioritizing their digital-minds hobby horse ahead of millions of real minds that are suffering right now.
What are your grounds for thinking that this has an almost insignificant chance of being "PR costly"?
I also didn't like this comment because it seemed unnecessarily arrogant, and dismissive of the many people working in areas that weren't defunded, who I hope you would consider at least part of the heart of the wonderful EA intellectual ecosystem.
"defund form the heart of the intellectual community that is responsible for the vast majority of impact of this ecosystem,"
That said, I probably do agree with this...
"An EA community that does not consider whether the minds we aim to control have moral value seems to me like one that has pretty seriously lost its path."
But I don't want to conflate that with the PR risk...