
FTX's demise is understandably dominating EA discussion at the moment. Like everyone in the community, I’m primarily just really sad about the news. A lot of people have been hurt; and a lot of funding that could have been used to do a staggering amount of good is no longer available.

I’m also aware of the dangers of publishing reactions too quickly. Many of the facts about FTX are not clear. However, the major effect of recent events on EA funding seems fairly unambiguous: there is substantially less funding available to EA today than there was a week ago.

Accordingly, I think we can lay out some early food for thought from the last year or so; and hopefully start an evolving conversation around funding in EA, and effective giving more broadly, as we learn more.

(P.S. Much of what is below has been said before, but it seems helpful to summarize it in one place. If it’s your work I’m drawing on, I hope you see this as validation of what you saw that others didn’t. I’ll link to you where I can!)

We need to aim for greater funding diversity and properly resource efforts to achieve this

Saying “we should diversify our funding” is almost cringingly obvious in the current circumstances. But a lot of the discussion about funding over the last year seemed not to acknowledge the risks when ~85% of expected funding is supplied by two pots of money; and when those pots are effectively reliant on the value of 2-3 highly volatile assets. Even when these issues were explicitly acknowledged, they didn’t seem to influence funding forecasts sufficiently - discussions rarely considered a future in which one or both sources of funding rapidly declined in value. We may find out in the coming months that grantees didn’t consider this risk sufficiently either, for example if they bought fixed assets with significant running costs.

Funding diversity is important not just to protect the overall amount of funding available to EA. It’s also vital to avoid a very small number of decision-makers having too much influence (even if they don’t want that level of influence in the first place). If we have more sources of funding and more decision-makers, it is likely to improve the overall quality of funding decisions and, critically, reduce the consequences for grantees if they are rejected by just one or two major funders. It might also reduce the dangers of money unduly influencing moral and philosophical views in a particular direction.

In short: it's still important to find new donors, even if the amount of existing funding is very large in expectation.

Of course, it is significantly easier to say we should diversify than actually to diversify. So what could we do to try to mitigate the risks of overly concentrated sources of funding?

Bring more donors at every level into EA

It seems wise to support organizations that can broaden the funding base of EA. Recently, there has been a move to prioritize securing major donors (and, in some extreme cases, to suggest that anything attracting small donors is a waste of time). Seeking more super wealthy donors is a sensible move (see below), but there is a reasonable case for seeking more donors in general at the same time. More donors in EA means more reliable, less volatile funding overall. Additionally, even if the amounts they give seem trivial by comparison, small donors can still make a difference, for example by funging dollars back to larger donors, who can then give more to things that are less popular, more technical or higher risk. 

To this end, we should celebrate, encourage and (most importantly) fund the wide range of projects that attract effective givers - pledge organizations (One for the World, where I am Executive Director; Giving What We Can); organizations targeting specific audiences (High Impact Professionals, High Impact Athletes); and national regranting organizations (there is a fairly comprehensive list here).

These organizations not only bring thousands of new donors into EA each year - they also simultaneously increase our chances of finding new major donors, whether those are earning to givers or, rarely, Ultra High Net Worths. Even ‘entry level’ approaches like the One for the World pledge have uncovered billionaire donor prospects, by exposing enough people to EA that even very rare events become more probable, while also attracting thousands of ‘grassroots’ givers.

I’m extremely excited that Open Philanthropy now has a dedicated pot of funding and programme officer for these sorts of organizations - but the risks of this space relying on a single funder are especially obvious at the moment. Almost all of these organizations have funding gaps right now and report that they have significant funding insecurity. You can directly improve the funding diversity and security of many of these organizations today. If you would like to donate to one or more of the above, please DM me.

(It would also be amazing if effective giving could be a more prominent part of EA Groups outreach, but I discussed this in some detail previously here. For some of those who made counterarguments there, the calculus may now have shifted.)

Seek more Ultra High Net Worth (e.g. billionaire) funders

I also think we need explicitly to target more Ultra High Net Worth (UHNW) donors. [UPDATE: for example, this guy.] 

One reaction to recent events might be to swear off these funders entirely. I’m really supportive of diversification of EA and keen to grow the movement, and understand that super rich figureheads may run counter to these goals. I’m also very conscious of the general reputational risks of being too closely associated with any specific individual or industry, or with billionaires in general.

However, it seems unlikely that EA cause areas could ever get the majority of their funds from small dollar donors, unless the overall amount available decreased dramatically. Almost all funding obeys the Pareto principle - and even movements that have tried to prioritize small dollar donors still receive most of their funding from a few massive donors in reality. So I suspect that avoiding UHNW fundraising entirely would be net negative. (Side note: if there are counterexamples, I’d be very interested to learn about them in the comments.)

On one level, saying “recruit more massive donors” sounds pretty facile. The benefits are fairly self-evident - more individual UHNW donors means more funding in total and, critically, less reliance on any one person’s wealth - but the execution is hard. However, I think there is some value to having this as an explicit EA goal; and to committing resources to projects that can help attract UHNW donors.

We need to recognize that each individual UHNW funder is inherently risky if their wealth is bound up in a small number of companies, whose value can change very quickly. I have heard some suggestions that a lesson from FTX’s demise is to diversify the assets of existing major funders. But this often isn’t controllable - asking FTX founders to sell FTX and diversify their wealth not only could have crashed the price of the company, it also would have been a significant barrier to convincing them to donate. It’s also not always good in expectation - holding on to significant chunks of stock can massively increase the amount of capital you can deploy in the future. As a result, finding more individual sources of massive funding is a more practical route to reducing risk than trying to restructure the wealth of existing donors.

Our best shot at achieving this would seem to be celebrating and funding efforts that are likely to generate interest from new UHNW donors. These include projects like Generation Pledge (which explicitly targets billionaire inheritors), Founder’s Pledge, GiveWell and Effective Giving. Each of these has the potential to bring new massive donors into the EA space - and, while most of them are adequately resourced, some have significant funding gaps or have small enough budgets that you can make a difference even as a small dollar donor.

Celebrate and encourage earning to give

A third reform I’d like to see is a refreshed focus on and celebration of earning to give. While FTX funding was still available, it’s understandable that earning to give seemed less impactful on the margin. But, in general, earn to givers are incredibly useful and impactful (and remained so throughout FTX’s rise and fall). They give amounts that can be extremely significant to small organizations. They can also funge not-insignificant funds back to major institutional donors if they donate to e.g. GiveWell charities; and they are often willing to fund things that are higher risk or less popular (e.g. meta charity start-ups). They also have a tiny chance of turning into mega donors, as Sam Bankman-Fried was for a period.

In general, I want to be forward-looking in this post. However, I do want to draw attention retrospectively to the apparent marginalization of earn to givers in recent years. I spent time with some earning to give people at EAG London 2021 and they all reported that they felt completely undervalued at the conference. One person, who is both an earn to giver and sits on the Board of more than one EA organization, had their initial application to the conference rejected - and the earning to give meetup was almost exclusively people considering doing it rather than people actually doing it, which suggests it’s possible that earn to givers were systematically not invited, or were rejected.

Even if earning to give becomes less valuable on the margin, and even as our priorities have evolved to give more weight to direct work, we have to acknowledge how profoundly virtuous earning to give is. It was really gut-wrenching to see people who acted on EA’s best advice in its early years, and who have made huge sacrifices in their own material wealth to help other people, feeling like there wasn’t a place for them at one of EA’s biggest conferences and generally feeling like ‘yesterday’s news’ when talking to delegates.

To this end, I was delighted to see the founding of High Impact Professionals to provide a community for those earning to give; and discussions with Open Philanthropy are ongoing about how best to nurture the effective giving space with some dedicated staff, an annual retreat and so on. 

I would love us to refresh our commitment to celebrate those who earn to give, whether that’s in systemic ways, like representation at EA events, or just in our personal interactions. They are people who work incredibly hard and then voluntarily give away hundreds of thousands, and sometimes millions, of dollars that they could spend on themselves.

We should retain awareness around optics, in good times and bad

It seems fair to say that the massive opportunity of a large influx of funding led us to take optics less seriously in the last year. This seems particularly unfortunate now that the source of funding has dried up, and we are seeing further reputational damage to EA as the FTX saga unfolds.

I don’t want to suggest at all that these were easy decisions at the time. It doesn’t necessarily follow that disbursing funding quickly and (arguably) somewhat liberally was a bad idea in expectation, just because FTX has now failed.

I do think, however, that we should carefully consider the steps we might take next time to mitigate reputational risks to EA. Again, I want to emphasize that I don’t think it’s straightforward to weigh up the benefits of disbursing funding really quickly versus using a slower, more structured approach. But it might be a good rule of thumb to err on the side of registering charitable foundations to disburse grants; to avoid putting single donors on a huge pedestal inside and outside the community; and to try to avoid major sources of funding being publicly tied to any one individual or industry.

These sorts of reforms might avoid some of the issues we’re now grappling with, where FTX grantees could plausibly be subject to clawbacks of their funding in a way that wouldn’t apply if FTX had made grants from a charitable vehicle; and where FTX’s failure might cause significant reputational damage to EA by association. If calling a fund ‘The Future Fund’ rather than ‘FTX Future Fund’ helps mitigate this risk, it seems worth doing. 

To make this point a bit facetiously, Open Philanthropy Project isn’t called 'The Facebook-Asana Fund' and hasn’t seen its reputation affected by scandals at Meta, for example. It's also noticeable that Dustin Moskovitz has received relatively little adulation and exposure inside and outside EA since becoming a mega donor.

I find it notable that the UHNW fundraisers above (Generation Pledge most of all, but also Founder’s Pledge, GiveWell etc.) barely mention Effective Altruism on their websites. Staff at several of them have told me directly that this is because of EA’s mixed reputation. This seems like a further reason to consider optics more carefully, especially if we are able to secure significant influxes of funding in the future - if we seem to have gone off-piste in utilizing one funder, we might inadvertently discourage other funders from getting involved. [UPDATE: again, for example this guy.]

We need to retain nuance in discussions around funding, in good times and bad

When FTX first announced the FTX Future Fund, it represented a major change in Effective Altruism. The wealth of FTX and Alameda co-founders was estimated to make up 35% of all funding available to EA. FTX was expected to contribute ~28-40% of all EA longtermist grants in 2022.

It therefore made sense to update our views in light of these significant changes.

What seemed to happen in practice, though, is that nuanced discussions on this theme were received and replicated in unnuanced ways (sometimes despite the best efforts of the original authors). For example, I saw and heard the following at various times and in various places:

Nuanced view: EA has a funding overhang - that is, the amount of money available to EA is growing more quickly than the number of highly-engaged EAs - and this should change our thinking on the margin
Unnuanced version: EA is overfunded (and, implicitly, will remain so); or Donating isn’t a high impact thing to do any more; or We should discourage people from donating and/or avoid promoting donating; or If you’re involved in direct work, it’s better on the margin to spend money on yourself than to donate it

Nuanced view: Because of the above, Earning to Give looks less impactful on the margin
Unnuanced version: Earning to Give is no longer a high impact career path and/or should be discouraged

Nuanced view: Longtermist community building and AI Safety have significant funds available to them, but many other cause areas do not, and available funding varies a lot depending on size and stage of organization
Unnuanced version: EA is overfunded (and will remain so), etc.

Nuanced view: EA has a lot more funding available to it in expectation than it did before; or GiveWell may need to roll over funding in the current fiscal year
Unnuanced version: EA cause areas such as global poverty are now overfunded

These are just a handful of examples, but they illustrate a repeated pattern of unnuanced takes - in Forum posts and comments, at EA conferences and in general conversation.

The lesson here seems fairly simple - we need to remember that epistemic humility is a core part of EA and, as far as is practical, we need to strive to couch our opinions (or the opinions of others) with appropriate levels of nuance and context. Broadly speaking, tropes and memes like "EA is overfunded" should be used with a lot of care, so that oversimplified ideas don’t take hold.

I don’t intend to draw attention to this in a mean-spirited way - I think a lot of people were arguing in good faith, and also the fact that these statements are no longer true doesn't necessarily mean they were definitely false at the time. I more want to emphasize that we need to have this level of nuance front and center if we secure other major funders in the future.

Things I’ve missed

As I said at the start, this is just a collection of personal thoughts - I hope it can be the start of an evolving conversation about what the FTX saga teaches us. I’m also sure I’ve missed things, so please do jump into the comments if you think I’m wide of the mark or should add extra sections to this post.

Comments

Strong agree on earning to give. It's also just the only EA-relevant career path that's available to a lot of later-career people who might want to support the movement, but get exasperated when every 80k profile begins 'Now that you've graduated magna cum laude from your elite university...'

Big +1 on this - OFTW does a lot of outreach at big corporations and at e.g. business schools, and people there are often not receptive to changing their career plans; but they can donate and make a huge difference.

Nice post. I really liked your point around having a nuanced view. I remember various conversations over the years on broad vs. deep, as if it's that simple. The reality is that the broader you go, the more likely you are to find the deep people.

We should retain awareness around optics, in good times and bad

I'd like to push back on this frame a bit. I almost never want to be thinking about "optics" as a category, but instead to focus on mitigating specific risks, some of which might be reputational.

See https://lesswrong.substack.com/p/pr-is-corrosive-reputation-is-not for a more in-depth explanation that I tend to agree with.

I don't mean to suggest never worrying about "optics" but I think a few of the things you cited in that category are miscategorized:

err on the side of registering charitable foundations to disburse grants; to avoid putting single donors on a huge pedestal inside and outside the community; and to try to avoid major sources of funding being publicly tied to any one individual or industry.

Registering a charitable entity to disburse grants mostly makes sense for legal reasons; avoiding funding sources being too concentrated is a good risk-mitigation strategy. We should do both of these but not primarily for optics reasons.

I agree with you that we should avoid putting single donors on a pedestal, and this is the one that makes most sense to do for "optics" reasons; but it's also the most nuanced one, because we also want to incentivize such people to exist within the ecosystem, and so shouldn't pull back too hard from giving status to our heroes. One thing that I would like to be better about along this axis is identifying heroes who don't self-promote. SBF was doing a lot of self-promotion. A well-functioning movement wouldn't require self-promotion.

This is interesting and I agree with much of it.

I think two extra things:

  1. I think optics are worth adding to your risk calculations, even if e.g. you think the principal risk is legal
  2. I didn't mention in the OP the most egregious examples of bad optics but I think some exist - I would argue flying people to the Bahamas to cowork has dreadful optics and that might be a strong argument against doing it

Jack - excellent posts. I was thinking along similar lines.

Regarding recruiting billionaires (ultra high net worth/UHNW individuals) -- there are allegedly about 3,300 billionaires in the world, including about 1,000 in the US, 400 in China, etc. Many of them made their wealth in relatively traditional industries, most are over age 50, and many like to keep a pretty low profile. 

I get the sense that EA donor recruitment has often focused on the nearest, lowest-hanging fruit -- young tech billionaires, who tend to live in the Bay Area, NY, London, etc -- where EAs tend to live. 

However, it might be helpful to pivot a bit towards the older, lower-profile, more traditional UHNWs. It's easy for young, bright, idealistic EAs to stereotype elderly oil magnates or middle-aged lords of retail empires as unlikely to be sympathetic to EA causes. But these folks usually made their money by being smart, open-minded, practical, and effective; they're often looking to give back to the world somehow; and they're often frustrated with the failures of most traditional charities.

Thanks Geoffrey. I would be supportive of efforts aimed at this demographic. My immediate thought is that we might face more resistance from baked-in attitudes, e.g. 'giving should be local' - but I have been pleasantly surprised by how many people in their forties and fifties have started donating after One for the World corporate talks.

Also worth noting that some cause areas like animal welfare have fared better with this 'old money' demographic.

Sorry if I missed this in other comments, but one question I have is if there are ways for small donors to support projects or individuals in the short term who have been thrown into uncertainty by the FTX collapse (such as people who were planning on the assumption that they would be receiving a regrant). I suppose it would be possible to donate to Nonlinear's emergency funding pot, or just to something like the EAIF / LTFF / SFF.

But I'm imagining that a major bottleneck on supporting these affected projects is just having capacity to evaluate them all. So I wonder about some kind of initiative where affected projects can choose to put some details on a public register/spreadsheet (e.g. a description of the project, how they've been affected, what amount of funding they're looking for, contact details). Then small donors can look through the register and evaluate projects which fit their areas of interest / experience, and reach out to them individually. It could be a living spreadsheet where entries are updated if their plans change or they receive funding. And maybe there could be some way for donors to coordinate around funding particular projects that each donor individually couldn't afford to fund, and which wouldn't run without some threshold amount. E.g. donors themselves could flag that they'd consider pitching in on some project if others were also interested.

A more sophisticated version of this could involve small donors putting donations into some kind of escrow managed by a trusted party that donates on people's behalf, with that trusted party sharing information with donors about projects affected by FTX. That would help maintain some privacy / anonymity if some projects would prefer that, but at administrative cost. I'd guess this idea is too much work given the time-sensitivity of everything.

An 80-20 version is just to set up a form similar to Nonlinear's, but which feeds into a database which everyone can see, for projects happy to publicly share that they are seeking shortish-term funding to stay afloat / make good on their plans. Then small donors can reach out at their discretion. If this worked, then it might be a way to help 'funge' not just the money but also the time of grant evaluators at grantmaking orgs (and similar) which is spent evaluating small projects. It could also be a chance to support projects that you feel especially strongly about (and suspect that major grant evaluators won't share your level of interest).

I'm not sure how to feel about this idea overall. In particular, I feel misgivings about the public and uncoordinated nature of the whole thing, and also about the fact that typically it's a better division of labour for small donors to follow the recommendations of experienced grant investigators/evaluators. Decisions about who to fund, especially in times like these, are often very difficult and sensitive, and I worry about weird dynamics if they're made public.

Curious about people's thoughts, and I'd be happy to make this a shortform or post in the effective giving sub-forum if that seems useful.

I think this is a good idea and would encourage you to post it on the sub-forum.

I think this is a great post, efficiently summarizing some of the most important takeaways from recent events.

I think this claim is especially important: 

"It’s also vital to avoid a very small number of decision-makers having too much influence (even if they don’t want that level of influence in the first place). If we have more sources of funding and more decision-makers, it is likely to improve the overall quality of funding decisions and, critically, reduce the consequences for grantees if they are rejected by just one or two major funders."

Here's a sketchy idea in that vein for further consideration. One additional way to avoid extremely wealthy donors having too much influence is to try to insist that UHNW donors subject their giving to democratic checks on their decision-making from other EAs. For instance, what if taking a Giving What We Can pledge entitled you to a vote of some kind on certain fund disbursements or other decisions? What if Giving What We Can pledgers could put forward "shareholder proposals" on strategic decisions (subject to getting fifty signatures, say) at EA orgs, which other pledgers could then vote on? (Not necessarily just at GWWC) Obviously there are issues: 

  • voters may not be the epistemic peers of grantmaking experts / EA organization employees
  • voters may not be the epistemic peers of the UHNW donors themselves who have more reputational stake in ensuring their donations go well
  • UHNW donors have a lot of bargaining power when dealing with EA institutions and few incentives to open themselves up to democratic checks on their decision-making 
  • determining who gets to vote is hard
  • some decisions need to be made quickly
  • sometimes there are infohazards 

But there are advantages too, and I expect that often they outweigh the disadvantages:

  • wisdom of crowds
  • diversified incentives
  • democracy is a great look

Here's a sketchy idea in that vein for further consideration. One additional way to avoid extremely wealthy donors having too much influence is to try to insist that UHNW donors subject their giving to democratic checks on their decision-making from other EAs.

Fwiw, if I were a UHNW individual (which I am not, to be clear), this would make me much less receptive to EA giving and would plausibly put me off entirely. I would guess this is more costly than it's worth?

[anonymous]

While I won't necessarily endorse your specific governance proposals here, since a good strategy warrants serious thought, I like your goals and I wholeheartedly agree that EA needs to consider the impact of letting a small group of UHNW individuals control the direction of the movement. I also agree that the OP is excellent, and it's what I've been scrolling through here and the EA subreddit hoping to find: someone taking a harder look at the issue.

If a person were really on board with EA principles, they should be willing to admit that their own judgement is fallible - so much so that it would be good to relinquish control of the money they're giving away to a larger, more diverse group of people. Certainly the funder could decide on the larger goals (climate change vs. AI safety, etc.), but I find myself questioning the motives of people who can't give up a large amount of control.

Was SBF legitimately on board with EA or was he doing it to launder his image? We may never know for sure, but there's a long history of billionaires doing exactly that through charitable giving. From Carnegie to the Sacklers, and I suspect even the recent announcement from Bezos, this is common practice among UHNW folks. 

We as a community need to realize the danger this poses to the movement. Already, there is a negative perception of EA due to the embrace, and sometimes outright worship, of charismatic billionaires. Billionaires who do not live the values that EA is supposed to be pushing: epistemic humility, collaboration, and the belief that every person's interests deserve equal weight. The community's acceptance of billionaires like Elon Musk and Peter Thiel jumps out at me as a giant red flag.

I will remain quite skeptical of any UHNW pledges that don't include the following:

  1. A transfer of a substantial amount of the pledge to a charitable organization in the short term, and a structured plan for how and when the balance of the pledge will be transferred. I understand that for people like SBF this wasn't possible because it wasn't liquid, but that just indicates that we should consider the pledge as unlikely to actually come to fruition. As the OP noted, we shouldn't consider a big pledge that's built on a highly volatile asset to be money in the bank.
  2. The receiving organization gains control over how the funds are disbursed, and the funder does not have control over leadership. They may specify a general goal for the money, but once pledged the funder should no longer have outsized impact on how the money is disbursed.

Pledges that don't follow this might certainly still be overall good uses of money and count as good philanthropy, but I think we ought to push that they aren't doing EA. If we really believe in the principles of EA, then we should hold our most visible funders to them.

Perhaps there could be two forms of engagement - one that is more transactional, and one that involves the person more as a figurehead. If they seem like a problematic figurehead, EA could try in some way to keep them more at arm's length. I guess this has already happened with Elon a bit, as he's not publicly associated with EA but is clearly EA-aligned.

I think there is at least one counterexample to your generally negative opinion of mega donors, btw, which is Dustin and Cari. I don't know them at all, so I guess this could age badly, but I have been thinking recently that we seem to have been exceptionally lucky with them. Open Phil is pretty close to best practice in a lot of ways.

I would also push back on your point 2, in that I actually think it's important to have leadership that has the confidence of both the funder and e.g. grantees. I don't think that's incompatible with the donor choosing the leadership.

[anonymous]

Dustin and Cari at Open Phil may be exceptions, I have zero inside knowledge about them. Assuming you're right and they're paragons of EA-ness, which would be quite laudable, I see that as the exception that proves the rule. Maybe they don't need to give up the reins because they live the values of EA so well, but that isn't true of most people, and there's no reason to think that'd be true of most UHNW people.

EA hasn't pushed Musk and Thiel away nearly strongly enough for me. I know EA isn't a top down movement, but there are individuals with lots of EA credibility who can and should be making it more clear that what those two are doing isn't EA.

On my point 2, I'll admit I don't have a clear solution in mind. There needs to be a way to ensure there are good people in charge who will apply EA principles toward the organization's goals, and maybe the funder can have some initial influence. However, I'm highly suspicious of people who got billions and now claim to want to give it away but only if they can control it. Concentration of power is dangerous in every other aspect of society, I think it's obvious that EA is no exception. If someone truly believes in the principles of EA, then they must be willing to at least dilute their control considerably.

From my basic understanding of Open Phil, it does seem like Dustin & Cari have given up the reins to a large extent? Open Phil has hired lots of staff who are making the granting decisions, although maybe Dustin & Cari have a large influence over the cause areas.

It does indeed. I believe Dustin is on record saying that if he disagreed with Open Phil he would likely defer to their decision.

I agree that your two points are the easier ones to implement (and hence I agree-voted), but I do believe democracy among a large group of stakeholders has to be baked in.

Thanks for this Zachary.

This is an interesting idea and I think should be discussed in some detail.

I am interested though in the trade-offs between better governance and the sort of governance that might stop people giving at all.

So, for example, I saw a good post saying that a reform could be "anyone asked to join a new funding vehicle could demand an audit and, if the funder refuses the audit, they should refuse to join and criticise it publicly and discourage other people from joining."

That seems very likely to stop FTX recurring; but also very likely to stop any UHNW investment in EA directly. So the question is 'what governance hurdles decrease risk but don't constitute a total barrier to entry?'

I wonder if submitting capital to your proposal seems a bit too much like the latter.

(Incidentally, I realise that asking 'what might a bad actor agree to?' is a slippery slope when deciding on what checks and balances to employ - but I think things like 'mega donors have to have an independent Board with financial and governance expertise, and register a charitable vehicle' is possibly a better balance than 'UHNWs need to let the crowd vet their giving decisions.')

So the question is 'what governance hurdles decrease risk but don't constitute a total barrier to entry?'

I agree. There are probably some kinds of democratic checks that honest UHNW individuals don't mind, but which have relatively big benefits for epistemics and community risk. Perhaps there are ways to add incentives for agreeing to audits or democratic checks? It seems like SBF's reputation as a businessman benefited somewhat from his association with EA (I am not too confident in this claim). Perhaps offering some kind of "Super Effective Philanthropist" title/prize/trophy to particular UHNW donors who agree to subject their donations to democratic checks or financial audits might be an incentive? (I'm pretty skeptical, but unsure.) I'd like to do some more creative thinking here.

I wonder if submitting capital to your proposal seems a bit too much like the latter.

Probably.

I really like this post and I think it's now my favorite post so far on the recent collapse of FTX.

Many recent posts on this subject have focused on topics such as Sam Bankman-Fried's character, what happened at FTX and how it reflects on EA as a whole.

While these are interesting subjects, I got the impression that a lot of the posts were too backward-looking and not constructive enough.

I was looking for a post that was more reflective and less sensational, one focused on what we can learn from the experience and how we can adjust EA's strategy going forward - and I think this post meets these criteria better than most of the previous ones.

Thanks Stephen, I really appreciate this feedback

Thanks for writing this! I'm inclined to agree with a lot of it.

I am cautious about over-updating on the importance of earning to give. Naively speaking, (longtermist) EA's NPV has crashed by ~50% (maybe more since Open Phil's investments went down), so (very crudely, assuming log returns to the overall portfolio) earning to give is looking roughly twice as valuable in money terms, maybe more. How many people are near the threshold where this flips the decision on whether ETG is the right move for them? My guess is actually not a ton, especially since I think the income where ETG makes sense is still pretty high (maybe more like $500k than $100k — though that's a super rough guess).
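To spell out the arithmetic behind that "roughly twice as valuable" claim - a minimal sketch, assuming the log-returns approximation described above, with W and d as illustrative symbols rather than figures from the comment:

U(W) = \log W \implies U'(W) = \frac{1}{W}, \qquad \Delta U \approx d \cdot U'(W) = \frac{d}{W}

So if total available funding falls from W_0 to W_0 / 2, the same marginal donation of d dollars is worth d / (W_0 / 2) = 2d / W_0, i.e. roughly twice as much at the margin.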

That said, there may be other reasons EA has been underrating (and continues to underrate) ETG, like the benefits of having a diversity of donors. Especially when supporting more public-facing or policy-oriented projects, this really does just seem like a big deal. A rough way of modeling this is that the legitimacy / diversity of a source of funding can act like a multiplier on the amount of money, where funding pooled from many small donors often does best. The Longtermism Fund is a cool example of this imo.

Another thing that has changed since the days when ETG was a much more widely applicable recommendation is that fundraising might be more feasible, because there are more impressive people / projects / track records to point to. So the potential audience of HNWIs interested in effective giving has plausibly grown quite a bit.

Thanks for writing this post! I wholeheartedly agree for the most part, and think that I personally have underweighted earning-to-give in the past (though it's obviously hard to disentangle this from hindsight bias, and there is the unfortunate dynamic that a strong quantitative skillset selects well for both high ETG potential and high AI safety potential). I particularly think I held a status quo bias re the stability of the funding and underweighted the lack of diversity.

I also appreciate the fairly even-handed tone, and the effort to avoid hindsight bias or "I told you so"s.

Thanks Neel, and especially for your honest self-reflection. I'm not confident that FTX failing means it was likely to fail in expectation, so I think we need to be careful about saying "because this failed it was wrong ever to think it wouldn't, and any argument built on that premise was stupid."

I'm also pretty worried about EAs tearing into each other ATM without the full facts (or even with them, tbh). So I was aiming for the tone you mention, and I hope we can see more of this in the discourse (albeit that many people are legitimately very, very angry at the moment).

Thanks Jack for the structured way of sharing your thoughts on this matter. I'm pretty new in the EA community (<1y) so it helps me get a better understanding on what's going on.

As a managing director of a local giving org (Doneer Effectief), I fully agree with your thoughts, especially on aiming for greater funding diversity. I try to do so, but as you mentioned, we're very dependent on the EA Infrastructure Fund right now. I'm very open to suggestions to diversify.
