Open Phil’s in-house legal team continues to keep an eye on developments in the FTX bankruptcy case. Some resources we’ve put together for charities and other altruistic enterprises affected by the case can be found here.

Last week the FTX Examiner released the Examiner’s Report. For context, on March 20 of this year, the FTX bankruptcy court appointed an independent examiner to review the various investigations into FTX, compile findings, and recommend additional investigations.

One notable finding that may be of interest to this audience is on page 165 of the Report (p. 180 of the pdf linked above). It reads: 

S&C[1] provided to the Examiner a list of over 1,200 charitable donations made by the FTX Group. S&C initially prioritized recovery of the largest charitable donations before turning to the next-largest group of donations and, finally, working with Landis[2] to recover funds from recipients of smaller donations. S&C concluded that, for recipients of the smallest value donations, any potential recoveries would likely be outweighed by the costs of further action.

Without engaging in litigation, the Debtors have collected about $70 million from over 50 non-profits that received FTX Group-funded donations. The Debtors continue to assess possible steps to recover charitable contributions. 

I am aware of many relatively small-dollar grantees (<$50k) that have received clawback letters, and I’m not sure where S&C/Landis’ cutoff is for “smallest value donations.” So I don’t have any well-calibrated thoughts on whether and how much to update based on this finding. If you are working with a bankruptcy attorney after receiving a clawback letter, it could be worth bringing this to their attention. 

Before you comment in response to this post, I would urge you to assume that lawyers for the FTX Group will see your comments. 


 

  1. ^

    S&C is Sullivan & Cromwell, the law firm that the FTX estate has retained to represent it in the bankruptcy proceedings, including in pursuing clawback claims.

  2. ^

    Landis is a smaller law firm that is working with S&C and FTX on pursuing small-value clawback claims.

Comments (2)



Before you comment in response to this post, I would urge you to assume that lawyers for the FTX Group will see your comments.

In that case, I suggest they consider taking a look at the Earning to Give section of the 80,000 Hours website.
