This is a repost from a Twitter thread I made last night. It reads a little oddly when presented as a Forum post, but I wanted to have the content shared here for those not on Twitter.
This is a thread of my thoughts and feelings about the actions that led to FTX’s bankruptcy, and the enormous harm that was caused as a result, involving the likely loss of many thousands of innocent people’s savings.
Based on publicly available information, it seems to me more likely than not that senior leadership at FTX used customer deposits to bail out Alameda, despite FTX’s terms of service prohibiting this and despite a (later deleted) tweet from Sam claiming that customer deposits are never invested.
Some places making the case for this view include this article from the Wall Street Journal, this tweet from jonwu.eth, and this article from Bloomberg (and follow-on articles).
I am not certain that this is what happened. I haven’t been in contact with anyone at FTX (other than those at Future Fund), except a short email to resign from my unpaid advisor role at Future Fund. If new information vindicates FTX, I will change my view and offer an apology.
But if there was deception and misuse of funds, I am outraged, and I don’t know which emotion is stronger: my utter rage at Sam (and others?) for causing such harm to so many people, or my sadness and self-hatred for falling for this deception.
I want to make it utterly clear: if those involved deceived others and engaged in fraud (whether illegal or not) that may cost many thousands of people their savings, they entirely abandoned the principles of the effective altruism community.
If this is what happened, then I cannot convey in words how strongly I condemn what they did. I had put my trust in Sam, and if he lied and misused customer funds he betrayed me, just as he betrayed his customers, his employees, his investors, and the communities he was a part of.
For years, the EA community has emphasised the importance of integrity, honesty, and respect for common-sense moral constraints. If customer funds were misused, then Sam did not listen; he must have thought he was above such considerations.
A clear-thinking EA should strongly oppose “ends justify the means” reasoning. I hope to write more about this soon. In the meantime, here are some links to writings produced over the years.
These are some relevant sections from What We Owe The Future:
Here is Toby Ord in The Precipice:
Here is Holden Karnofsky: https://forum.effectivealtruism.org/posts/T975ydo3mx8onH3iS/ea-is-about-maximization-and-maximization-is-perilous
Here are the Centre for Effective Altruism’s Guiding Principles: https://forum.effectivealtruism.org/posts/Zxuksovf23qWgs37J/introducing-cea-s-guiding-principles
If FTX misused customer funds, then I personally will have much to reflect on. Sam and FTX had a lot of goodwill – and some of that goodwill was the result of association with ideas I have spent my career promoting. If that goodwill laundered fraud, I am ashamed.
As a community, too, we will need to reflect on what has happened, and how we could reduce the chance of anything like this happening again. Yes, we want to make the world better, and yes, we should be ambitious in the pursuit of that.
But that in no way justifies fraud. If you think that you’re the exception, you’re duping yourself.
We must make clear that we do not see ourselves as above common-sense ethical norms, and must engage criticism with humility.
I know that others from inside and outside of the community have worried about the misuse of EA ideas in ways that could cause harm. I used to think these worries, though worth taking seriously, seemed speculative and unlikely.
I was probably wrong. I will be reflecting on this in the days and months to come, and thinking through what should change.