This article is a writeup of the conversation at a meetup hosted by Austin Less Wrong on Saturday, November 19, 2022. The topic was the collapse of the FTX cryptocurrency exchange and its implications. There were a total of 28 participants, although people came and went throughout the meetup, so the number present at any given time was less than that. This was not an EA meetup, but it attracted more EA-involved people than usual.
Epistemic disclaimer: This writeup should not be read like a normal, well-polished article written by a single author. It contains speculative, off-the-cuff remarks which the speakers may have regarded as tentative even at the time, and which may be easily revised by new evidence. Each statement given here represents the views only of the particular person speaking (which may change from one sentence to the next), and not of the other participants. Below the section break, "I" refers to the current speaker.
General disclaimer: I took pains to make it clear before, during, and after the meetup that I was taking notes for posting on LessWrong later. I do not endorse posting meetup writeups without the knowledge and consent of those present!
What was going on at FTX?
When I first heard the news about FTX being insolvent, I thought they would still honor withdrawals. I figured I'd be fine because I only have money on FTX.US, which is separate from the insolvent entity, but it turned out they were commingling funds—FTX.US is part of the bankruptcy and so its funds will be distributed to non-US customers. (But FTX bought LedgerX and they seem to be fine—why was it not also included in the bankruptcy?)
I expected Sam Bankman-Fried ("SBF") to have his act together, because he had been rescuing other exchanges.
Early on, a bunch of employees quit Alameda Research, and if these people have overlap with the EA community, I want to know more about whatever scandal caused them to quit. Tara Mac Aulay, the co-founder of Alameda, was one of them. She later co-founded Lantern, and previously worked for the Centre for Effective Altruism (CEA). She wrote a Twitter thread [archive] saying that the people who left Alameda did so "in part due to concerns over risk management and business ethics."
I think FTX would've still been successful without the scamming—they had a reliable business, and the team had widely-applicable skills. Gemini is an example of a billion-dollar business that got there (presumably?) without scamming people. Although, it may be true that FTX wouldn't have gotten quite as big so fast if they hadn't scammed.
It seems like any venture capitalist investing in FTX would've done due diligence and known something was wrong just by looking at the cap table. For example, why did Sequoia invest in them? Was that before all the bad stuff started? But FTX's accounting practices were already bad from the beginning. Maybe they thought that they could make money off of their investment and get out before it crashed—VCs aren't above that.
There was some institution (which one?) that was going to receive a donation from FTX, but did due diligence and decided not to.
What is the historical base rate of bank runs, from before the FDIC existed?
Fraud and its aftermath
How many people knew about the fraud? I think just five: SBF, Caroline Ellison, Gary Wang, Nishad Singh, and Sam Trabucco. But didn't Trabucco leave before this happened? Well, they were doing bad stuff even back when he was CEO. Maybe a few more people than this knew, but I'd be surprised if it was a lot of people, because a lot of FTX employees also got screwed. Trabucco might have seen the writing on the wall.
When did the fraud start? When do you think FTX actually became insolvent? My guess is that it was when Three Arrows Capital went down, which screwed up the complex loans they had done—this was in June. What about the Luna collapse? The Luna collapse was precipitated by Three Arrows Capital. The fraud began as soon as FTX lent money to Alameda; the insolvency came later.
In his interview with Kelsey Piper from Vox [archive], SBF explains that they had FTX depositors wire money to Alameda because FTX didn't have its own bank account. This is when the fraud began. Okay, maybe this is illegal, but I'd argue it wasn't "morally" fraudulent until Alameda started using the money as if it was theirs. But also, this only explains the missing dollars, not the missing cryptocurrency.
SBF goes on to explain that each individual step seemed reasonable. What does he mean by this? Wasn't lending FTX money to Alameda already crossing a clear line into fraudulent and unethical territory, regardless of the perceived risk of these loans? Counterargument: There are things that are illegal but which people don't really consider immoral, like stealing a notepad from your office. Taking depositor funds as a "loan" is worse than this, but of a similar status. There's a notion that "if you win, it's not unethical"—like the story (celebrated in business circles) of FedEx gambling their last $6,000 of investor funds in a casino, without which the company would've failed. It seems okay because of social context—the majority of cryptocurrency funds do this sort of thing. You can see how a sequence of steps that each have a seemingly "controllable" amount of immorality can add up to considerable wrongdoing.
In the same interview he also says that the main thing he regrets is declaring bankruptcy, since otherwise he would've been able to raise enough money to pay everyone back. But that would've just been a straight-up Ponzi scheme. He also says he's still trying to raise money in his own personal capacity. But this seems doubtful since he no longer has any authority to agree to terms.
If he regrets declaring bankruptcy, why did he do so in the first place? Would it have been bad not to? Maybe it would've had adverse implications for his criminal liability if he hadn't.
What will be the outcome of the bankruptcy? Their assets are worth at most 10% of their liabilities. The administrators will take a cut, which I assume will amount to about half of the remaining assets. The team includes the guy who was in charge of the Enron bankruptcy, as well as the former heads of the SEC and CFTC, and a bunch of other big-name feds, so based on what these people make, we can estimate that the bankruptcy proceeding is costing them a lot. • But I will be shocked if the proceeding ends up costing more than $200 million.
The nature of FTX's business
Matt Levine, in an interview with SBF [archive] from April, characterized FTX's activities as a Ponzi scheme. ►But a Ponzi scheme isn't fraudulent per se, because everyone could be fully informed about it. ▻Even a transparent Ponzi scheme is still illegal. ►Really? I didn't think so. (Don't take legal advice from this meetup!) ◩
I've also heard it described as a "flywheel scheme" [archive] or "yield farming" (which is what the Levine interview is about, although the reason for likening it to a Ponzi scheme is not discussed in depth). What is yield farming? You "stake" a token (i.e. lock it up out of use for a fixed period of time) in return for getting a share in the issuance of new tokens, and the inflationary pressure is counterbalanced by the deflationary effect of the coins being locked out of use.
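The balance between issuance and lock-up described above is easy to sketch with made-up numbers (the supply, stake ratio, and issuance rate here are all hypothetical, not any real token's parameters):

```python
# Toy staking economics: issuance dilutes holders, but staked (locked)
# tokens are out of circulation, which offsets the inflationary pressure.
total_supply = 1_000_000   # hypothetical token supply
staked = 600_000           # tokens locked up for the staking period
issuance_rate = 0.05       # fraction of supply minted per year, paid to stakers

new_tokens = total_supply * issuance_rate   # 50,000 minted this year
staker_yield = new_tokens / staked          # nominal yield earned by stakers
circulating = total_supply - staked         # tokens actually tradable right now
print(f"{staker_yield:.1%} yield, {circulating} circulating")
```

The higher the fraction staked, the lower the per-staker yield, but also the less supply is actually circulating to absorb the new issuance.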
Offering interest to people to reward them for holding a token or using the platform is a common way to get a new token off the ground. Compound has a system where you can deposit cryptocurrencies with them and borrow other ones. For some of their supported currencies, you get paid in Compound's native token (which is used for voting). You pay variable interest on what you've borrowed, and sometimes the value of the Compound token exceeds that of the interest you're paying, so you're effectively being paid to borrow.
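The "paid to borrow" situation is just arithmetic; a minimal sketch with invented numbers (not Compound's actual rates or token price):

```python
# Net cost of borrowing when a protocol rewards borrowers in its native token.
borrow_apr = 0.04               # hypothetical variable interest rate
borrowed_value = 50_000.0       # USD value of the borrowed asset
reward_tokens_per_year = 50     # hypothetical native-token rewards for borrowing
token_price = 60.0              # hypothetical market price of the reward token

interest_paid = borrow_apr * borrowed_value           # $2,000 in interest
rewards_value = reward_tokens_per_year * token_price  # $3,000 in token rewards
net_rate = (interest_paid - rewards_value) / borrowed_value
print(net_rate)  # negative rate: effectively being paid to borrow
```

Of course, the reward token's price is volatile, so the sign of the net rate can flip at any time.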
During the cryptocurrency boom, FTX was taking money, using that as a war-chest, investing it, and using returns to pay back customers. They say this is not a Ponzi scheme. It kind of is, but it's also kind of like a hedge fund. Have any cryptocurrencies been successfully launched in this way? Yes. Is that what they were doing with the FTX token? Yes.
If I were to do this, I would've done this initially to build up funds, but then de-risk once I had enough.
I am more offended by their incompetence than anything else. They kept their private keys in a shared email account—anyone who knows the first thing about cryptocurrency would know how to store coins in a multisig account with secure offline signing. They commingled FTX and Alameda funds in the same bank account and didn't keep track of how much belonged to each company (as mentioned from the Vox interview above).
Do you believe that FTX actually got hacked, or was this just a way for SBF to funnel money away? The news today was that the hack was done by an insider at the request of the Bahamian government in order to make sure that Bahamian depositors got their money back first. But the wallets had names like "Fuck FTX." Kraken claims they know who the hacker is.
I also heard that they had a backdoor to be able to untraceably move money around; is that true? Everything could very well be true: they forgot about the commingling of funds, and they got hacked, and they had a database backdoor, and the Bahamian government was in on it.
Alameda Research gave personal loans to SBF ($1 billion) and to Nishad ($535 million). What could they possibly be spending that much money on in their personal capacity? This relates to the question of how genuine they were as EAs. I have no beliefs about this; I just think it's weird and confusing. (Were they paying interest on these loans?)
They were also buying houses for their employees (who were also their dating partners). Was this just for fun?
Apparently there's a theory [archive] that supposed FTX co-founder Gary Wang does not actually exist, because the only photo of him [archive] is of the back of his head, and he seems to have zero social media presence. This theory is probably not true. But FTX didn't keep good records of whom they employed, and there are multiple people who were supposedly employees but whom nobody is able to contact to confirm that they exist. (But then again, if you were an FTX employee, wouldn't you be reluctant to respond to random people contacting you now?) This lends credence to the "Gary Wang does not exist" theory, or at least that there were some fake people on the payroll. Why would they do this? To have a scapegoat to pin the blame on? To hide money from creditors?
Scott Alexander wrote about drug use at FTX [archive]: we can infer what drugs they were taking based on photos of their desks. Apparently they were taking some Parkinson's drug (Emsam) and amphetamines (Adderall) at the same time. Scott seemed overconfident that the drug usage wasn't affecting them much, given that the dosages may have been absurdly high, and the doctor prescribing them was insane.
That article mentions that FTX's "onsite happiness counselor" was prescribing drugs to the employees, which is a conflict of interest because he can either serve the company or his patients, but not both. Is the licensing board investigating this guy? FTX probably bribed the licensing board. Was he licensed under Bahamian or American law? I thought he was licensed in America.
I've heard that amphetamines decrease empathy, which may have affected their behavior. (They should've been doing more psychedelic mushrooms!) Note that amphetamines do not decrease empathy for people with ADHD; in fact some studies show they increase empathy—it's nice to have the right level of dopamine in the brain. I've also never heard of amphetamines at a proper dose (say, 20mg at a time maximum) decreasing empathy. People who abuse meth for example generally do 50mg per day or more.
Anecdotally, I checked Erowid for trip reports on Emsam [archive]: At least 2 out of 5 people said "I didn't care about making money before, but now I do." It increases agency and risk-taking behavior, which makes sense as a treatment for depression since depression is basically an abnormal lack of those things; but if you take such a high dosage that you overshoot the norm you might end up behaving like the FTX people did.
An FTX employee I knew was helped by the company in getting stimulants; I don't know if they were already taking them before they joined, however.
Sincerity of their EA beliefs
How genuine was SBF's belief in EA? I think he was genuine early on, but I don't know about more recently. I first heard about SBF a few years ago, via a mutual friend whose knowledge of him I trust, which is why I'm confident that SBF was a genuine EA at least a few years ago. I spent time with his brother, who also seemed genuine.
What about "Caroline's" Tumblr blog—was that even really her? Yes. She's been a longtime fixture of the Tumblr rationalist community. The blog seems very cynical and detached, which may speak to the question of her genuineness. But I met her and she seemed normal, well within a typical EA/rationalist level of cynicism. A friend of mine knew her, and agreed she seemed normal.
Caroline posted something on Tumblr like "How do I convey in my dating profile that despite being a nerdy rationalist I'm also feminine? Should I put this before or after the section about wire fraud?" How serious was this?
Anyway, it's not true that cynicism means that someone isn't genuine. Eliezer Yudkowsky is also really cynical. I know of people who communicate in that way but are sincere. EA is particularly prone to this because any principle taken to an extreme is going to sound absurd, e.g. "Shouldn't we just kill all animals?" It's also a cultural thing—rationalist Tumblr has a peculiar sort of tone.
Attitudes towards risk
SBF was explicit in multiple interviews about his views on risk: as soon as you hear that he has linear utility in money, and that he thinks the St. Petersburg Paradox and Gambler's Ruin are actually valid, this is enough to explain why the company went bust (aside from the ethical aspect of why he was also willing to gamble with other people's money). In most worlds, a strategy informed by this view ends in ruin.
If I had known he would bite the bullet on St. Petersburg I would've said "Oh, shit...." In fact, he said this a long time ago. But some other EAs have also said the same thing, i.e. that utility should be linear in money if you're doing good for others instead of just accumulating personal wealth. (E.g. [references found afterward] Brian Tomasik [archive]: "Effective altruists often see why altruistic risk-aversion doesn't make sense: If you're trying to actually help as much as possible, then sparing ten animals from suffering is actually ten times as good as sparing one, even if the dopamine release in the donor's brain is not ten times as high." Stuart Armstrong: "In my estimation, the expected utility for the singularity institute's budget grows much faster than linearly with cash.")
But if you bet everything you have, the losses don't taper off. This mixes up two separate questions. Even if the utility of money is linear, this still doesn't imply that you should keep pulling the lever on the St. Petersburg machine forever, only that you should pull it more than someone who regarded the utility of money as diminishing.
At any rate, even if you were a pure "make number go up even if it involves unethical actions" utilitarian, you still wouldn't have displayed such incompetence as to e.g. commingle funds in a poorly-labeled bank account or store private keys in a shared email account. These were entirely unnecessary risks that can't be justified by any utilitarian calculus. Even if they weren't trying to behave ethically, they still should've had an incentive not to let the business fail.
I never would've thought that ignoring the Kelly criterion would have such concrete implications: it would cause you to want to keep doing double-or-nothing bets forever. If you bet more than twice the Kelly fraction (which is where treating utility as linear rather than logarithmic pushes you), your long-run growth rate turns negative and you will be ruined with probability 1.
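The Kelly point is easy to check numerically: for an even-money bet with win probability p, the expected log-growth per bet from wagering a fraction f of your bankroll is p·ln(1+f) + (1−p)·ln(1−f). A minimal sketch (the 60% edge is illustrative):

```python
import math

def growth_rate(p, f):
    """Expected log-growth per even-money bet when wagering fraction f of bankroll."""
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

p = 0.6                # win probability of a favorable even-money bet (illustrative)
kelly = 2 * p - 1      # Kelly fraction for even odds: f* = p - (1 - p) = 0.2

print(growth_rate(p, kelly))      # positive: wealth compounds
print(growth_rate(p, 2 * kelly))  # roughly zero: twice Kelly is about the break-even point
print(growth_rate(p, 3 * kelly))  # negative: long-run ruin with probability 1
```

Linear utility in money pushes f all the way to 1 (bet everything every time), which is deep inside the negative-growth region: expected wealth rises, but almost every sample path goes to zero.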
On the ethical front: Given that they were gambling with other people's money, they were playing a zero-sum game. When they double their money, that's at the expense of someone else losing the same amount. You would have to assert not just that utility is linear in money per se, but that the utility of redistributing money from depositors to FTX is also linear, which is a much stronger claim.
In doing that, there's also the risk of going to jail. One's world-model might not take adequate account of unknown unknowns. Rationalists are prone to this blind spot, and therefore tend to bet too strongly.
Is SBF consoling himself that there's some other branch of the wave-function in which they saved the world? Maybe he still thinks we're on it.
Who was aware of who SBF was and had an opinion of him? [~8 people.] Who was aware but had no opinion? [~5 people.]
Is anyone here negatively affected by this? • I'm losing about $20,000 worth of cryptocurrency I kept on FTX. • I have cryptocurrency and it lost value. • I work doing audits for blockchain projects, so if all of them fail I'll be out of a job. • Not directly, but I was planning to enter the FTX Future Fund essay contest. • So was I.
Someone I know worked at FTX and was interviewed by Yahoo Finance about it [archive] last year. He says he tried to talk SBF out of going into cryptocurrency, but later changed his mind about the field.
The reporting in the New York Times and Forbes has been great for SBF—they're just saying things like "it's unfortunate." He has tendrils in the media and they're doing softball coverage (especially the New York Times). Is this because he was a big Democrat donor? But the Democrats are going to want to distance themselves from him now, because you only have political capital if the politicians think you're going to keep giving them money. Also, someone else at FTX gave big money to Republicans.
There will be negative spillover onto EA. The "woke" segment has been critical of EA for a while, and they're popping the champagne right now: "Oh, look at this ridiculous grant to 'ride my bicycle and see if it helps me relax...'." The FTX situation invites people to scrutinize goofy use of funding.
As per the Copenhagen interpretation of ethics [archive], EAs are now being judged more harshly than people who were never even trying to do good. Regardless of whether the EA projects are goofy, they're going to get much more scrutiny than they deserve. Once the internet hivemind is activated, people won't criticize claims by people on the same side.
There's a natural human tendency to punish others who are over-altruistic, because we're evolved to not want very high standards of altruism in our society. Maybe people found it hard to criticize SBF before this, but now the floodgates are open. There's no more irresistible target than someone who sets themselves up as a paragon of morality but is then exposed as false.
Media reports argue: EA was all about forecasting existential risk, but they didn't perceive the existential risk to their own community—ironic! This is a silly argument—if SBF got in a car accident, would that mean he was wrong about existential risk?
What about the Culture War angle? ►FTX's political activities will shield EA from the bulk of the PR fallout—people on the right are gleeful that a Democrat donor was caught in a scandal, and people on the left don't want to talk about it. ▻But the narrative can shift quickly if a few mainstream outlets start saying that EA caused this. ►Most people will probably continue not caring about EA or knowing what it is. They care more about the political donations. People will only care about EA if it is lumped together with Democrats as part of a general smear of the left by the right. That's the way Culture War discourse works.
▻But even if Culture War eats everything, isn't EA already involved because it's seen as "evil eugenics" etc.? There was a recent article about how "Billionaires like Elon Musk want to save civilization by having tons of genetically superior kids," which associates eugenics with transhumanism, longtermism, and EA. There was a discussion about "eugenics in ML" involving Timnit Gebru (who was controversially fired from Google), Melanie Mitchell, Gary Marcus, and Amanda Askell (who was defending EA). In my bubble a lot of people know about EA and criticize it from the political left.
►But this is definitely a social bubble. Even within bigtech companies hardly anyone knows about EA. ▻Are you saying most "normies" aren't involved in Culture War at all? ►Yes; active Culture Warriors are only a small section who have strong opinions. Even that is mostly focused on elections etc. not EA. People on Twitter are mostly just talking about sports. ◩
Sure, more people will think worse of EA now than will think better of it, but it doesn't seem like the amount of PR damage is big enough to make me think the future is doomed.
Morality of returning funds donated by FTX
Setting aside the question of legal obligations, are recipients of FTX donations morally obligated to return funds?
►No, since it's like a plumber who got paid to fix the toilet at the FTX office, who nobody would say has any obligation. ▻But in that case the money has already been spent. The employees of FTX are no longer going to get paid going forward, so neither should grant recipients. ►The money has also already been spent in this case (by giving it to EA organizations). ▻But the services have not yet been rendered. ►The services have already been rendered by the organization's act of taking the money, and people have already started doing things in the expectation of getting the money. ▻But the same applies to employees. Or, maybe you're morally allowed to keep whatever compensates for the costs you've already sunk, but you should return the rest.
▻What if there were two donation recipients, one of which was wired the money the day before the collapse and the other of which was scheduled to get it the next day, but didn't? Is the first entitled to the money and the second not? It seems weird to attach moral significance to such a technicality. ►Both would be morally owed money. ▻But the problem is that more money is owed in total than money exists; that's why bankruptcy must prioritize claims. Employees have a very junior claim; depositors have the most senior claim. (Is this true? What about investors?) ◩
What if FTX bought me $10,000 of fine liquor and cigars at a fancy party? I'd still feel guilty about this even though there would be nothing I could give back.
In the Bernie Madoff case, withdrawals were clawed back on the grounds that it's unfair for someone who withdrew early to be treated differently from someone who didn't, and this was regardless of whether the money had already been spent. (I'm not sure I agree with this legal principle, but I think that's what the law is.) We don't want to create a perverse incentive to spend money immediately, so we need to treat spenders and non-spenders alike (either claw back from both, or from neither).
As a practical matter, even if you wanted to return the money, it's not clear how you would do this, or whether the fraud victims would actually end up with it.
Safely using cryptocurrency
How would you like to brag about your foresight? • I'm glad I didn't invest in cryptocurrency. • I'm glad I kept my coins on multiple exchanges. • I'm glad I only used exchanges that were subject to US regulations. (But didn't that include FTX? Possibly they were US-regulated but just didn't comply.) • I specifically avoided FTX because I also work in EA—for the same reason that Google employees sell their stock immediately upon vesting, to avoid overexposure. • ►It's difficult and ugh-fieldy to get involved in complex cryptocurrency stuff, so my akrasia and slow-moving approach saved me. (▻But won't that also make you miss out on opportunities? ►Yes, but the returns are asymmetric. ▻Depends on your prior of the whole cryptocurrency ecosystem.) ◩
Don't use centralized exchanges—"Not Your Keys, Not Your Coins." If I were to use one, I'd lean towards Gemini or Coinbase, which are the most US-regulation-compliant. Gemini is based in New York, which has even stricter regulations than the rest of the US.
But isn't it safer to hold coins on an exchange than to self-custody? I don't think so; people are overestimating the risk of losing their keys and underestimating the risk of using an exchange.
Can I make money trading without using a centralized exchange? Yes, that's what Alameda Research itself was doing (i.e. DeFi exchanging)—the fees are pretty low, especially since the Ethereum fee restructuring (EIP-1559, which went live in August 2021).
On the other hand: If you want to earn interest, you must accept counterparty risk. If you want to do short-selling, you can avoid centralization by using a liquidity pool, but the fees add up quickly unless you open and close your position all at once, which is generally not the best bet; a series of small transactions accumulates a lot of fees.
There is a risk of smart contracts being hacked. Many of them determine the value of collateral by looking only at a short span of recent history, so it's possible to artificially prop up the value in order to drain assets from the contract. (Compound solves this problem by looking only at long-established coins.) DeFi protocols also have bugs—how can you tell if new ones are reliable? Uniswap and Compound have been good so far. Compound had their native tokens stolen, but not depositor money.
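The short-history oracle risk can be illustrated with a toy time-weighted average price (TWAP) oracle. This is only a sketch: real on-chain oracles accumulate price×time rather than averaging discrete samples, but the principle is the same.

```python
from collections import deque

class TWAPOracle:
    """Averages the last `window` price observations, so a one-block
    price spike moves the reported value only slightly."""
    def __init__(self, window):
        self.prices = deque(maxlen=window)

    def update(self, price):
        self.prices.append(price)

    def read(self):
        return sum(self.prices) / len(self.prices)

oracle = TWAPOracle(window=100)
for _ in range(99):
    oracle.update(10.0)    # honest market price
oracle.update(1000.0)      # attacker pumps the spot price for one block

spot = 1000.0              # a naive spot-price oracle reports the manipulated price
twap = oracle.read()       # (99 * 10 + 1000) / 100 = 19.9, far less distorted
print(spot, twap)
```

Shrinking `window` toward 1 turns the TWAP back into a spot oracle, which is exactly the "short span of recent history" vulnerability described above.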
This is a reminder not to invest more in cryptocurrency than you can afford to lose. But FTX depositors were screwed not because they invested in cryptocurrency, but because they held it on FTX. So, consider "invest" in a broad sense—any way you store money has some risk.
Future EA projects
We EAs will need to tighten our belts a bit. But the lack of funds will only be as bad as we let it be—Open Phil is still funding, and the community will course-correct to get more donors and encourage more earning-to-give. I hope that the projects getting funded will now be of higher quality, which could be better for EA—there's the cliche that "your first idea is not your best idea," but when there was so much money flying around, people didn't need to have their best ideas to get funded. Now they'll have to improve their ideas and ramp up hiring slower. There is a way that EA can come out of this stronger, without a huge loss of funding over the next few years.
I wish Open Phil were more "weird." I was excited about the FTX Future Fund because they were willing to... I want to say "take risks" but that seems like a poor choice of words. We need more ambitious megaprojects, but Open Phil is too cautious. But Open Phil grantmakers have been overtly changing their course in the last year or so, and acknowledging that EA-hours are more valuable than dollars.
I don't think the FTX collapse means we should be more risk-averse generally—this was a specific bad actor who colluded to hide things. If the top people in tech and business are actively trying to hide things, you can't have known about it unless you're doing a lot of external auditing, which apparently wasn't happening in this case. Also, I'm still in favor of EA people starting companies.