Caroline Ellison, co-CEO and later CEO of Alameda, had a now-deleted blog, "worldoptimization" on Tumblr.  One does not usually post excerpts from deleted blogs - the Internet has, of course, saved it by now - but it looks like Caroline violated enough deontology to be less protected than usual in turn, and also I think it's important for people to see what signals are apparently not reliable signs of honesty and goodness.

In a post on Oct 10 2022, Caroline Ellison crossposted her Goodreads review of The Golden Enclaves, book 3 of Scholomance by Naomi Novik.  Caroline Ellison writes, including very light / abstract spoilers only:

A pretty good conclusion to the series.

Biggest pro was the resolution of mysteries/open questions from the first two books. It wrapped everything up in a way that felt very satisfying.

Biggest con was … I think I felt less bought into the ethics of the story than I had for the previous two books?

The first two books often have a vibe of “you can either do the thing that’s easy and safe or you can do the thing that’s hard and scary but right, and being a good person is doing the right thing.” And I’m super on board with that.

Whereas if I had to sum up the moral message of the third book I might go with “there is no ethical consumption under late capitalism.”

For someone like myself, this is a pretty shocking thing to hear somebody say, on a Tumblr blog not then associated with their main corporate persona, not in a way that sounds like the usual performativity, not like it's meant to impress anybody (because then you're probably not writing about anything as undignified as fantasy fiction in the first place).  It sounds like - Caroline might have been under the impression, as late as Oct 10, that what she was doing at FTX was the thing that's hard and scary but right?  That she was doing, even, what Naomi Novik would have told her to do?

The Scholomance novels feature a protagonist, Galadriel Higgins, with unusually dark and scary powers, with a dark and scary prophecy about herself, trying to do the right thing anyways and being misinterpreted by her classmates, in an incredibly hostile environment.

The line of causality seems clear - Naomi Novik, by telling her readers to do the right thing, probably contributed to Caroline Ellison doing what she thought was the right thing - misusing Alameda's customer deposits.  Furthermore, the Scholomance novels romanticized people with dark and scary powers, and those people not just immediately killing themselves in the face of a prophecy that they'd do immense harm later, i.e., sending the message that it's okay for them to take huge risks with other people's interests.

I expect this to be a very serious blow to Naomi Novik's reputation, possibly the reputation of fantasy fiction in general.  The now-deleted Tumblr post is tantamount to a declaration that Caroline Ellison was doing this because she thought Naomi Novik told her to.  We can infer that probably at least $30 of Scholomance sales are due to Caroline Ellison, and with the resources that Ellison commanded as co-CEO of Alameda, some unknown other fraction of Scholomance's entire revenues could have been due to phantom purchases that Ellison funded in order to channel customer deposits to her favorite author.


My moral here?  It can also be summed up in an old joke that goes as follows:  "He has no right to make himself that small; he is not that great."

The best summary of the FTX affair that I've read so far is Milky Eggs's "What Happened at Alameda Research?"  If you haven't read it already, and you're at all interested in this affair, I recommend that you go read it right now.

Piecing together various sources, including some allegedly shared by FTX employees (and including some comments posted by those employees to the Effective Altruism forum), Milky Eggs constructs a harrowing story of how Alameda Research probably lost in excess of $15 billion.  Primary causative factors:

  • Their actual arb strategies stopped working, and were frog-boilingly gradually replaced with long bets on crypto that paid out during the boom and exploded during the bust;
  • Poor accounting, possibly just no really global accounting or sense of where the money was going;
  • Excessive use of stimulants, including those known to result in compulsive gambling behavior;
  • A corporate acquisitions spree, possibly partially motivated by buying up corporate entities that held the FTT token and could have tanked the market by dumping it, maybe even raiding those companies for their own customer deposits;
  • A general lack of spending discipline: for example, buying naming rights to the e-sports organization TSM for $210M, which was way out of line with comparable deals in e-sports.

Completely missing from Milky Eggs's account:  Any mention of effective altruism, except that the EA Forum is listed as a source for some of their alleged-ex-FTX-employee accounts.

Why?

Because - and I say this meaning it gently, and with kindness - you were not that fucking important.

The amount that FTX spent on e-sports naming rights for TSM was greater than everything they donated to effective altruism.

Can you imagine how you'd judge it if, rather than my writing it as a joke, Naomi Novik had gone online and sincerely tried to accept blame for FTX's fall, because she thought she hadn't been careful enough to put messages about good corporate governance and careful accounting into her fantasy novels, and Novik had talked about how she was planning to donate an appropriate portion of her Scholomance book royalties back to FTX's ruined customers?  Depending on her state of mind, you might either try to gently console her and somehow get her to realize that she was being way too scrupulous and might possibly want to try standard meds for OCD at some point; or, on another hypothesis about Novik's state of mind, you might try to gently explain that she's not the center of the universe and that this wasn't mostly about her.

This would be true even if Sam Bankman-Fried himself had presented as a Naomi Novik fan, if he had told others that he wanted to be a Novik-style DoTheRightThingist just like Galadriel Higgins the Scholomance protagonist, and he had funneled $140M to causes having to do with things that were on-theme for some of Novik's books.  The $140M would still be less than FTX had spent on e-sports naming rights.  SBF calling himself a Novikian RightThingist would not have been much of a factor in why he was trusted, compared to their claims of being the first GAAP-audited crypto exchange and so on.

There probably would be some sort of weird blowup in the Novik fandom, in that case; it would make more sense for them to wonder if they were responsible.  But I'd expect people in the Novik fandom to also vastly overestimate how much it was all about them, in that case; because they would know all about Novik, but have less daily exposure to the much wider world in which FTX operated.  They'd have heard about the money donated to RightThingism but not about the e-sports naming rights.  They would not realize that there were other and bigger fish in the pond.

(Be it clear, I'm not analogizing myself to Novik in that metaphor.  I'm analogizing Peter Singer and classical Givewell-style EA to Novik.  I asked SBF if he wanted to meet with me ever, he never got around to it, I do not think he was a Yudkowsky fan and he hung out with some EAs who definitely weren't.)

(ADDED:  I am not saying that EA influence on Alameda was comparable in magnitude to Novik's influence on Caroline Ellison; I am giving an example of the mental motion of trying to grab too much responsibility because you don't know about all the parts of the universe that aren't yourself.)

It wouldn't, even, reflect all that badly on the spirit running across many fantasy novels of RightThingism.  Not just because "no true Scotsman", not even because SBF would have really actually missed the point of fantasy-novel RightThingism.  But because the amount that FTX spent on e-sports naming rights vs the amount they gave to RightThingist causes, and how they didn't take a billion off the table for RightThingism while they still had a billion, maybe belied a bit the idea that RightThingism was in fact that central to their mental lives.

Also Milky Eggs's account says that FTX's own employees were encouraged to keep all their salaries on the exchange, which... I don't really have words.  It's not - what you'd expect somebody to do if they still had even fantasy-novel RightThingism inside them.  The Milky Eggs account says that Caroline Ellison was one of four FTX employees who knew.  I wish I had a reliable printout of what Caroline Ellison was actually thinking at the time she wrote that Tumblr post.  I would bet that, even without the benefit of hindsight on how it turned out, Naomi Novik wouldn't have agreed with it at the time.

And whatever Caroline Ellison was thinking when she wrote that, it is obvious - when you look at it from safely outside - that it wasn't Naomi Novik's fault.

If Caroline Ellison had worn a Naomi Novik T-shirt and put the Scholomance books in her Twitter profile and told her crypto clients "Trust me, I read fantasy novels and I know what the Right Thing is," it would still not have been Naomi Novik's fault.

It wouldn't have been the fault of the abstract concept of “you can either do the thing that’s easy and safe or you can do the thing that’s hard and scary but right, and being a good person is doing the right thing”.  Plenty of people have read fantasy novels like that and not wrecked depository institutions.  Not just in terms of moral responsibility, but actual causality, I'd be surprised if that was really in actual fact a key driver in the decisions that Caroline Ellison made; maybe she used that to rationalize that afterwards, but I doubt it's what was going through her mind on the fatal day that FTX used customer deposits to pay back Alameda creditors (if that's in fact when FTX first touched customer deposits).  Pride did it, I'd sooner guess, or the desire to not not not be in this universe going so badly and taking the only step that preserved the feeling that everything could still be okay.

Who's at fault for FTX's wrongdoing?

FTX.

Ask a simple question, get a simple answer.

You have no right to blame yourself any more than that.  You weren't that important.


If there's anyone other than FTX who's really to blame, here, it's me.  I've written some fiction that tries to walk people through the experience of abandoning sunk costs and facing reality.  Including my most recent work.

Caroline Ellison, according to her Tumblr, had even started reading it...

But her liveblogs cut out before she got very far in.

I just wasn't a good-enough writer; I lost my reader's attention, and with it, perhaps, the world.



Now, some people might say here:  "But Eliezer, aren't you co-writing that story with another author?"  And to this I can only reply:  I see no reason why the existence of any other people in the universe ought to detract from my own sole accountability for everything that anyone does inside it.

Comments (88)

I'm analogizing Peter Singer and classical Givewell-style EA to Novik.

What about the parts of EA that aren't Peter Singer and classical GiveWell-style EA? If those parts of EA were somewhat responsible, would it be reasonable to call that EA as well?

I don't think the analogy is helpful. Naomi Novik presumably does not claim to emphasize the importance of understanding tail risks. Naomi presumably didn't meet Caroline and encourage her to earn a lot of money so she can donate to fantasy authors, nor did Caroline say "I'm earning all of this money so I can fund Naomi Novik's fantasy writing". Naomi Novik did not have Caroline on her website as a success story of "this is why you should earn money to buy fantasy books or support other fantasy writers".  Naomi didn't have a "Fantasy writer's fund" with the FTX brand on it. 

I think it's reasonable to preach patience if you think people are jumping too quickly to blame themselves. I think it's reasonable to think that EA is actually less responsible than the current state of discourse on the forum suggests. And I'm not making a claim about the extent to which EA is in fact responsible for the events. But the analogy as written is pretty poor, and... (read more)

I agree that if I, personally, had steered SBF into crypto, and uncharacteristically failed to add on a lot of "hey but please don't scam people, only do this if you find a kind of crypto you can feel good about" I might consider myself more at fault.  I even think that the Singer side of EA in fact does less talking about deontology, less writing of fiction that exemplifies the feelings and reasoning behind that deontology, less cautioning of people against twisting up their brains by chasing good ideas; on my view, the Singer side explicitly starts by trying to twist people's brains up internally, and at some point we should all maybe have a conversation about that.

The thing is, if you want to be sane about this sort of thing, even so and regardless I think Peter Singer himself would not have approved this, would obviously not have approved this.  When somebody goes that far off the rails, I just don't see how you could reasonably hold responsible people who didn't tell them to do that and would've obviously not wanted them to do that.

I agree that if I, personally, had steered SBF into crypto, and uncharacteristically failed to add on a lot of "hey but please don't scam people, only do this if you find a kind of crypto you can feel good about" I might consider myself more at fault.

Given how big of a role EA apparently had in the origin of Alameda (Singh says in the Sequoia puff piece that it wouldn’t have started without EA), there very likely are many members of the community who offered more encouragement and/or didn’t give as many warnings as they should have.

I don’t know at what point that fault transcends the individual and attaches to the community, but at the very least, adding up other individuals’ culpabilities in steering SBF to crypto without appropriate caution would seem to put a lot of the blame you say you personally avoid on EA as a whole.

Habryka

DM conversation I had with Eliezer in response to this post. Since it was a private convo and I was writing quickly I had somewhat exaggerated in a few places that I've now indicated with edits.

Habryka

Hmm, I do feel like I maybe want to have some kind of public debate about whether indeed we could have noticed that a bunch of stuff about FTX was noticeable, and whether we have some substantial blame to carry. 

Like, to be clear, I think the vast majority of EAs had little they could have or should have done here. But I think that I, and a bunch of people in the EA leadership, had the ability to actually do something about this. 

I sent emails in which I warned people of SBF. I had messages written, but never sent, that seem to me like if I had sent them they would have actually caused people to realize a bunch of inconsistencies in Sam's story. I had sat down my whole team, sworn them to secrecy, and told them various pretty clearly illegal things that I heard Sam had done [sadly all unconfirmed, asking for confidentiality and only in rumors] that convinced me that we should avoid doing business with him as much as possible (this was when we were considering whethe

... (read more)

Commending Habryka for being willing to share these things. It takes courage, and I think reflections/discussions like this could be really valuable (perhaps essential) to the EA community having the kind of reckoning about FTX that we need.

Great points, all. Even if most people could do nothing and Sam was not motivated by a core problem with EA philosophy, that doesn’t mean there was nothing that EAs close to the situation could have done differently. I would love to see a public airing of what genuine evidence people think they might have had that should have changed those people’s behavior around Sam.

I think I share more like 30% of friends with FTX leadership

Assuming that this means that the FTX leadership is friends with prominent EAs, I think that this fact raises some questions that many people might consider important.

For instance, I think some people might find it important to know what those friends have been doing with respect to this situation for the past week. What sort of communication have they had with the FTX leadership? Do they still feel loyalty toward SBF/Caroline/etc.? Are they in any way aiding or abetting them to commit crimes or avoid the legal or reputational consequences of their actions?

These might be dumb questions, and I apologize if so. They occurred to me because I model people as being quite likely to aid and abet with their close friends' criminal or malicious activity, but I acknowledge that that model could be wrong and/or not very applicable to this situation.

Here are some excerpts from Sequoia Capital's profile on SBF (published September 2022, now pulled). 

On career choice: 

Not long before interning at Jane Street, SBF had a meeting with Will MacAskill, a young Oxford-educated philosopher who was then just completing his PhD. Over lunch at the Au Bon Pain outside Harvard Square, MacAskill laid out the principles of effective altruism (EA). The math, MacAskill argued, means that if one’s goal is to optimize one’s life for doing good, often most good can be done by choosing to make the most money possible—in order to give it all away. “Earn to give,” urged MacAskill. 

... 

It was his fellow [fraternity members] who introduced SBF to EA and then to MacAskill, who was, at that point, still virtually unknown. MacAskill was visiting MIT in search of volunteers willing to sign on to his earn-to-give program. 

At a café table in Cambridge, Massachusetts, MacAskill laid out his idea as if it were a business plan: a strategic investment with a return measured in human lives. The opportunity was big, MacAskill argued, because, in the developing world, life was still unconscionably cheap. Just do the math: At $2,000 per life

... (read more)

Some corrections of the Sequoia info:

  • I've never been a grad student.
  • I'm neither Japanese nor a Japanese citizen.
  • I ‘volunteered’ in the sense that people at Alameda reached out to me, I said ok and then got paid by the hour for my help.
  • ‘(obscure, rural)’ is an exaggeration. ‘provincial’ would be a more apt adjective for the location. The main bank we used was SMBC, the second-largest bank in Japan.
  • ‘for a fee’ sounds as if it was some sort of bribe to get them to do what we wanted. But we only paid the usual transaction fees and margin that any bank would charge.

But mostly, if https://forum.effectivealtruism.org/posts/xafpj3on76uRDoBja/the-ftx-future-fund-team-has-resigned-1?commentId=hpP8EjEt9zTmWKFRy is accurate, I'm bummed that the money I helped earn was squandered right away.

Definitely: you are obviously right and Eliezer obviously wrong about this, imho. 


BUT

I do think it is hindsight bias to some degree to think that "EA" as a collective or Will MacAskill as an individual can be regarded as doing something wrong, in the sense of "predictably a bad idea", at any point in the passages you quote. (I know you didn't actually claim that!) It's not immoral to tell someone to found a business, so it's definitely not immoral to tell someone to found a business and give to charity. It's not immoral to help someone make a legal, non-scammy trade, as the anonymous Japanese EA apparently did ("buy low and sell high" is not poor business ethics as far as I know, though I'm prepared to be corrected about that by someone who actually knows finance.) It's a bit more controversial to say it's not wrong to take very rich people's money to do the sort of work EA charities do, but it's certainly not obvious that it is, and nothing in the quoted passages actually shows that any individual had evidence that FTX were a bad org to be associated with. (They may well have, I'm not saying no one did wrong, I'm just saying no wrong-doing is suggested by the information quoted here... (read more)

Thanks for this comment. 

I'm more interested in reflecting on the foundational issues in EA-style thinking that contributed to the FTX debacle than in ascribing wrongdoing or immorality (though I agree that the whole episode should be thoroughly investigated).

Examples of foundational issues: 

  • FTX was an explicitly maximalist project, and maximization is perilous 
  • Following a utilitarian logic, FTX/Alameda pursued a high-leverage strategy (Caroline on leverage);  the decision to pursue this strategy didn't account for the massive externalities that resulted from its failure 
  • The Future Fund failed to identify an existential risk to its own operation, which casts doubt on their/our ability to perform risk assessment 
  • EA's inability and/or unwillingness to vet FTX's operations (lack of financial controls, lack of board oversight, no ring-fence around funds committed to the Future Fund) and SBF's history of questionable leadership points to overeager power-seeking  
  • MacAskill's attempt to broker an SBF <> Elon deal re: purchasing Twitter also points to overeager power-seeking 
  • Consequentialism straightforwardly implies that the ends justif
... (read more)
RobBensinger
Sounds right to me! I agree with Eliezer that a lot of EAs are over-blaming EA for the FTX implosion, based on the facts currently known. But the Scholomance case is obviously a lot weaker than the EA case in real life, and this is a great summary of why.

The point is not "EA did as little to shape Alameda as Novik did to shape Alameda" but "here is an example of the mental motion of trying to grab too much responsibility for yourself".

RobBensinger
Fair!

This seems to be a false equivalence. There's a big difference between asking "did this writer, who wrote a bit about ethics and this person read, influence this person?" vs "did this philosophy and social movement, which focuses on ethics and this person explicitly said they were inspired by, influence this person?"

I agree with you that the question

Who's at fault for FTX's wrongdoing?

has the answer

FTX

But the question

Who else is at fault for FTX's wrongdoing?

is nevertheless sensible and cannot have the answer FTX.

Rina

Couldn't agree more strongly.

The inferential jump from someone reading a book in their spare time, making a pretty superficial Goodreads review about a main takeaway, to

It sounds like - Caroline might have been under the impression, as late as Oct 10, that what she was doing at FTX was the thing that's hard and scary but right?

is a pretty big one, and kinda egregious honestly.

Agree, we shouldn't give a pass to irrational (frankly, egocentric) thinking just because it feels like taking responsibility. 

I feel especially irritated with people who are ready to change their entire utilitarian philosophy just because someone associated with us (probably) committed a major crime and got caught, as if they didn't understand last week that they lived in a world where surprises like that can happen. I don't understand how else they could update their moral philosophy so fast based on the info we have.

Maybe they weren't familiar with the overwhelming volume of previous historical incidents, hadn't had their brains process history or the news as real events rather than mythology, or were genuinely unsure about how often these sorts of things happened in real life rather than becoming available on the news.  I'm guessing #2.

I feel especially irritated with people who are ready to change their entire utilitarian philosophy just because someone associated with us (probably) committed a major crime and got caught

I agree that this is pretty weird. There were presumably a bunch of historical contingencies that went into whether the FTX implosion occurred; it seems weird if we should endorse some moral philosophy X in the world where all those contingencies occurred, and some different moral philosophy Y in the world where not all of those contingencies occurred.

And it also seems weird if we should endorse the same moral philosophy in both worlds, but this one data point -- an important data point EV-wise, but still a single event, historical contingency and all -- is crucial evidence about such a high-level proposition. Evidence that we somehow didn't acquire via looking at the entirety of human history, the entire psychology and sociology literature, etc.

The least-weird versions of this update I can imagine are:

  • "This isn't a large update about high-level questions like that, but it's at least an interesting case study. We shouldn't treat it as a huge deal evidentially, but having a Schelling case study
... (read more)
Rebecca
There’s also a 3rd option - we should have been updating based on what was already talked about re SBF before the implosion (his pathological behaviour, his public statements essentially agreeing he’s running a Ponzi scheme, and people warning other people about these). So the implosion makes us realise that, in a world where FTX didn’t implode we still should have disassociated from SBF very early on, and be doing some soul searching about why UK EA leaders were [/are, in this hypothetical world] choosing to hype up someone with a track record of being so terrible

I think it’s very worth reflecting on strategic decisions that were made around Sam. I just don’t think what happened is very significant to whether utilitarianism is the correct moral philosophy.

RyanCarey
I agree that these events are separate from arguments for & against utilitarianism as a criterion of rightness. But they do undermine the viability of the act utilitarian calculus as a decision procedure. Sam seems to have thought of himself as an act utilitarian, but by neglecting to do the utilitarian calculus correctly or at all, he did massive harm, making it clear that we can't rely on this decision procedure to avoid such harms. Instead, we need utilitarians to adopt a decision procedure that includes constraints on certain behaviour.
  • In practice I think utilitarians should adopt mostly a skillful combination of virtue ethics, deontic rules, and explicit calculations.
  • I think what the FTX case does provide some evidence for is some fraction of smart EAs exposed to utilitarianism being prone to rely on explicit act utilitarianism, despite the warnings.

    I think part of the story here is a weird status dynamic where...
    1. I would basically trust some people to try the explicit direct utilitarian thing: e.g. I think it is fine for Derek Parfit or Toby Ord.
    2. This creates some weird correlation where the better you are on some combination of (smartness / understanding of ethics / power in modelling the world), the more you can try to be actually guided by consequences
    3. This can make being 'hardcore' consequentialist sort of cool and "what the top people do"
    4. ... which is a setup where people can start to goodhart/signal on it

Yeah, I think it's a severe problem that if you are good at decision theory you can in fact validly grab big old chunks of deontology directly out of consequentialism including lots of the cautionary parts, or to put it perhaps a bit more sharply, a coherent superintelligence with a nice utility function does not in fact need deontology; and if you tell that to a certain kind of person they will in fact decide that they'd be cooler if they were superintelligences so they must be really skillful at deriving deontology from decision theory and therefore they can discard the deontology and just do what the decision theory does.  I'm not sure how to handle this; I think that the concept of "cognitohazard" gets vastly overplayed around here, but there's still true facts that cause a certain kind of person to predictably get their brain stuck on them, and this could plausibly be one of them.  It's also too important of a fact (eg to alignment) for "keep it completely secret" to be a plausible option either.

Sam seems to have thought of himself as an act utilitarian, but by neglecting to do the utilitarian calculus correctly or at all, he did massive harm, making it clear that we can't rely on this decision procedure to avoid such harms.


I completely agree that a motivated person could easily believe that any decision is the right act utilitarian decision because there aren't clear rules for determining the right act utilitarian decision and checking your answer. Totally.  

But idk if it's even fair to say Sam was using act utilitarianism as a decision procedure. It's not clear to me if he even believed that while he was (allegedly) committing the fraud.

RyanCarey
I totally agree. But even if we conservatively say that it's a 50% chance that he was using act utilitarianism as his decision procedure, that's enough to consider it compromised, because it could lead to multiple billions of dollars of damages (edited). There are also subtler issues: if you intend to be act utilitarian but aren't and do harm, that's still an argument against intending to use the decision procedure. And if someone says they're act utilitarian but isn't and does harm, that's an argument against trusting people who say they're act utilitarian.

Not trying to take this out on you, but I'm annoyed by how much all this advocacy of deontology all of sudden overlaps with covering our own asses. I don't buy it as a massive update about morality or psychology from the events themselves but a massive update about optics. 

Reposting from Twitter: It's a moderate update on the prevalence of naive utilitarians among EAs.

Expanded:

Classical problem with this debate on utilitarianism is that the vocabulary used makes a motte-and-bailey defense of utilitarianism too easy.
1. Someone points to a bunch of problems with an act consequentialist decision procedure / cases where naive consequentialism tells you to do bad things
2. The default response is "but this is naive consequentialism, no one actually does that"
3. You may wonder whether, while people don't advocate for or self-identify as naive utilitarians ... they actually make the mistakes

The case provides some evidence that the problems can actually happen in practice, in important enough situations to care. [*]

Also, you have the problem that sophisticated naive consequentialists could be tempted to lie to you about their morality ("no worries, you can trust me, I'm following the sensible deontic constraints!").  Personally, before the recent FTX happenings, I would be more of the opinion "nah, this sounds too much like an example from a philosophical paper, unlikely with typical human psy... (read more)

I'd agree with this statement more if it acknowledged the extent to which most human minds have the kind of propositional separation between "morality" and "optics" that obtained financially between FTX and Alameda.

Linch
This will be a relief if true. I am much more worried about people not having principles (or having their principles guided by something other than morality) than about people being overly concerned with optics. The latter is a tactical concern (albeit a big one) and hopefully fixable; the former is evidence that people in our movement are too conformist or otherwise too weak or too evil to confront moral catastrophes.
Holly_Elmore
I don’t think they know they are concerned about optics. My suspicion was that the bad optics suddenly made utilitarian ideas seem false or reckless.

This strikes me as a bad play of "if there was even a chance".   Is there any cognitive procedure on Earth that passes the standard of "Nobody ever might have been using this cognitive procedure at the time they made $mistake?"  That more than three human beings have ever used?  I think when we're casting this kind of shade we ought to be pretty darned sure, preferably in the form of prior documentation that we think was honest, about what thought process was going on at the time.

Why require surety, when we can reason statistically? There've been maybe ten comparably-sized frauds ever, so on expectation, hardline act utilitarians like Sam have been responsible for 5% of the worst frauds, while they represent maybe 1/50M of the world's population (based on what I know of his views 5-10yrs ago). So we get a risk ratio of about a million to 1, more than enough to worry about.
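Spelled out, the back-of-envelope arithmetic is as follows (both inputs are the rough guesses stated above, not measured quantities):

```python
# Sketch of the risk-ratio argument, using the comment's assumed figures.
share_of_worst_frauds = 0.05        # assumed share of the ~10-20 largest frauds ever
population_base_rate = 1 / 50e6     # assumed prevalence of hardline act utilitarians

risk_ratio = share_of_worst_frauds / population_base_rate
print(f"risk ratio: about {risk_ratio:,.0f} to 1")  # about 2,500,000 to 1
```

With these inputs the ratio comes out closer to 2.5 million to 1; "about a million to 1" is the conservative rounding.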

Anyway, perhaps it's not worth arguing, since it might become clearer over time what his philosophical commitments were.

9
Holly_Elmore
1y
I guess it's some new evidence that one person was maybe using act utilitarianism as a decision procedure and messed up? Also not theoretically impossible he was correct in his assessment of the possible outcomes, chose the higher EV option, and we just ended up in one of the bad outcome worlds.
0
RobBensinger
1y
I don't understand this argument at all. I assume nobody thought it was literally impossible for the implementation of a moral theory (any moral theory!) to lead to bad consequences before. Maybe I'd understand your point more if you stated it quantitatively. Like: "Previously, I thought it was x% likely that a random act utilitarian would be led by their philosophy to do worse stuff than if they'd endorsed most other moral theories. After seeing the case of SBF, I now think the probability is y% instead, because our sample size is small enough that a single data point can be a large update."

Looks like Eliezer was similarly confused by your phrasing; your new argument ("almost no multibillion dollar frauds have ever happened, so we should do a very large update about the badness of everything that might have contributed to SBF defrauding people") sounds very different, and makes more sense to me, though I suspect it won't end up working.

5
RyanCarey
1y
I think you're right - I could have avoided some confusion if I said it could lead to "multi-billion-dollar-level bad consequences". Edited to clarify.
7
RobBensinger
1y
This seems to me like it's overstating the strength of evidence, as though FTX is a disproof rather than one data point among many. It is a disproof for extremely strong claims like "people who endorse act utilitarianism never do unethical things", but those claims should have had extremely low probability pre-FTX.
4
Holly_Elmore
1y
How much of an update is this really, though? Am I wrong that it's already the majority utilitarian view that act utilitarianism may be theoretically correct, but individual humans don't have the foresight to know the full consequences of every act, and humans trying to work together need to be able to predict what others will do --> something like rule utilitarianism or observing constraints? Seems like the update should be about how much you can know how things will turn out and whether you can get away with cutting corners. It does seem like Sam had pathological beliefs re: the St. Petersburg paradox, but that seems like more than wanting to maximize EV too much - it's not caring about the long-term future (where everyone's inevitably dead after enough coin flips) enough. I really don't see how that can be attributed to act utilitarianism either.
2
RyanCarey
1y
I agree that most utilitarians already thought act utilitarianism as a decision procedure was bad. Still, it's important that more folks can see this, with higher confidence, so that this can be prevented from happening again. I think I agree that the St Petersburg paradox issue is orthogonal to choice of decision procedure (unless placing the bet requires engaging in a norm-violating activity like fraud).
2
Holly_Elmore
1y
Risking the entire earth seems like a norm violation to me
2
Milan_Griffes
1y
Here are some jumping-off points for reflecting on how one might update their moral philosophy given what we know so far. 
Sabs
1y
29
13
5

idk, when people explicitly endorse your ideology as why they endorse "high leverage and double-or-nothing flips" I think it's at least worth taking a look at yourself. Now quite probably the person in question has misunderstood your ideology and doesn't understand why EAs do in fact care about the risk of ruin and why stealing money isn't ok, but then perhaps try to correct them?

Fwiw I think it very unlikely that the decision to use customer funds was a one-off decision made in 2022. My view is that FTX was set up from the start to use customer money as a source of cheap capital for Alameda. In 2018 Alameda was offering potential investors a 15% guaranteed return on loans. It seems fairly likely that at some point SBF figured "fuck this, why are we offering these dorks 15% when we can just set up our own exchange and access huge amounts of capital at 0%". Never mind the fact that privileged information from the exchange may well have opened up more ways for Alameda to make money!

The plan, imo, was always to accrue as much wealth as possible as fast as possible with as few ethical constraints as possible. This worked for a while because Alameda's trades wer... (read more)

I mean he is a big sports fan, at least baseball, at least when he was younger. I got linked to his blog from 10 years ago from something, and the number one and two sets of posts were about baseball statistics.

The role of the EA movement in the case of FTX seems surely to meet the level of influence attributed to some of the impact wins that EA has claimed so far.

Perhaps most prominently, the movement:

  • Gave the idea of 'Earning to Give' to Sam
  • Provided a primary motivation to Sam and other FTX leadership to build the exchange

For example, when comparing to the case of Sendwave, the influence seems at least comparable if not larger, e.g. it played a motivational role in founding a company for the purpose of improving the world. (I'm not familiar with Wave's founders' motivations, so could be wrong here.)

In welfare terms alone, the impact of FTX's collapse on its customers seems plausibly comparable to some of the impact wins of the movement to date, i.e. on the order of $1bn in lost funds. Given this, I think that an honest impact evaluation of the EA movement would include the harm caused to customers through FTX's collapse.

This is relevant not for blame assignment, but because it's very decision-relevant to EA's mission of improving the world. For example, when in the future deciding how much to emphasise harm avoidance when encouraging the (good and novel) idea of Earning to Give.

[anonymous]
1y
20
5
0

I think that an honest impact evaluation of the EA movement would include the harm caused to customers through FTX's collapse.


Agreed. However:

In welfare terms alone, the impact of FTX's collapse on its customers seems plausibly comparable to some of the impact wins of the movement to date, i.e. on the order of $1bn in lost funds.

Are you talking about welfare terms or financial terms? Because $1bn in lost savings of FTX customers seems very different in welfare terms to $1bn spent on bed-nets etc. I think there are strong reasons FTX shouldn't have acted the way it did, but suggesting these two things are comparable in welfare terms because they are similar in financial terms seems like an error to me.

1
callum
1y
Yeah I agree, I just mean that $1bn in funds lost to customers across the world is plausibly comparable in welfare terms to other wins on that list. E.g. dividing by 10 to account for differences in income of those affected, it would be around the amount attributed to GiveDirectly on the EA impact page. (without wanting to make a very direct crude comparison, or getting into the details of that) 
2
[anonymous]
1y
Okay yes, they may well be. I'm also pretty hesitant to attempt to make direct crude comparisons - and I'll say again that I think there are strong reasons FTX shouldn't have acted as it did, in addition to the direct harm to customers - but I'll just say that I seem to remember 100x or 1000x multipliers being more common than 10x in similar scenarios.

It is well-written, but I am not particularly convinced by the fantasy fiction analogy — it feels a lot more like “Here’s this very different situation, and you agree that the conclusions would be different. That would even be true if we modify it in several hard-to-imagine ways.”

In particular, I don’t see any reasonable analogies for:

  • EA’s “Earning to Give” career path, up to and including 80k featuring a profile on SBF as an exemplar.
  • The specific logic of “my marginal money is going to be donated” => “I should be closer to risk-neutral”, which I haven’t really seen rebutted on the facts (most instead argue that in reality, SBF/FTX/Alameda went too far and were risk-seeking).

That SBF ultimately contributed such a paltry amount of his apparent fortune is more impactful, but mainly as a reminder of how small and vulnerable EA actually is. It might very well be true that we didn’t mean that much to SBF, but he meant a lot to us.

2
demirev
1y
I also think this was a not-so-good and somewhat misleading analogy - the association between Novik and Caroline in the example is strictly one-way (Caroline likes Novik, Novik has no idea who Caroline is), whereas the association between FTX and EA is clearly two-way (e.g. various EA orgs endorsing and promoting SBF, SBF choosing to earn-to-give after talking with 80k etc).

I don’t [currently] view EA as particularly integral to the FTX story either. Usually, blaming ideology isn’t particularly fruitful because people can contort just about anything to suit their own agendas. It’s nearly impossible to prove causation, we can only gesture at it.

However, I’m nitpicking here but - is spending money on naming rights truly evidence that SBF wasn’t operating under a nightmare utilitarian EA playbook? It’s probably evidence that he wasn’t particularly good at EA, although one could argue it was the toll to further increase earnings to eventually give. It’s clearly an ego play but other real businesses buy naming rights too, for business(ish) reasons, and some of those aren’t frauds… right? 

I nitpick because I don't find it hard to believe that an EA could also 1) be selfish, 2) convince themselves that ends justify the means and 3) combine 1&2 into an incendiary cocktail of confused egotism and lumpy, uneven righteousness that ends up hurting people. I've met EAs exactly like this, but fortunately they usually lack the charm, knowhow and/or resources required to make much of a dent.

In general, I’m not surprised with the community's react... (read more)

The point there isn't so much, "He could not have had any EA thoughts in his head at all", which I doubt is really true - though also there could've just been pressure from coworkers, and office politics around it, resolving in something like the Future Fund so that they were doing anything.  My point is just that this nightmare is probably not one of a True Sincere Committed EA Act Utilitarian doing these things; that person would've tried to take more money off the table, earlier, for the Future Fund.  Needing an e-sports site named after your company - that's indeed something that other businesses do for business reasons; and if it feeds your business, that's real, that's urgent, that has to happen now.  The philanthropy side was evidently not like that.

3
samuel
1y
"My point is just that this nightmare is probably not one of a True Sincere Committed EA Act Utilitarian doing these things" - I agree that this is most likely true, but my point is that it's difficult to suss out the "real" EAs using the criteria listed. Many billionaires believe that the best course of philanthropic action is to continue accruing/investing money before giving it away. Anyways, my point is more academic than practical; the FTX fraud seems pretty straightforward and I appreciate your take. I wonder if this forum would be having the same sorts of convos after Thanos snaps his fingers.

I still think that this incident should overall update most EAs in the direction of 1) ethical injunctions are important for humans and 2) more EAs should read the ethical injunctions section of the sequences. I agree that there is no system of ethics, or cultural movement, so awesome that it will stop its most loyal adherents from doing terrible things, but some do better than others. Nobody should feel guilty except for the people who committed the crime, but it would be great if EAs thought the right amount about how to lower the prob of events like this in the future, and that amount is not zero. 

I'm also not sure how to square your advice about how I should relate to this incident with heroic responsibility.

Which ethical systems do you think have a better track record and why? Does virtue ethics, the preferred moral system of Catholics, have to take responsibility for pedophile priests? Does the rule-based ethics of deontology have to take responsibility for mass incarceration in the USA?

I can understand people claiming that this ethics implies that crazy conclusion, or assigning blame to an idea that seems clearly to have inspired a particular person to do a particular act. But I have no confidence that anybody on this earth has a clue about which ethical system is most or least disproportionately to blame for common-sense forms of good or bad behavior.

I think liberalism has a better track record than communism, for instance. No, but I do think Catholics should spend some time thinking about what's up with catholic priests molesting children, particularly if that catholic has any control over what goes on in the church. In general I do not think blaming this or that ethical system or social movement makes much sense, but noticing that the adherents of some social movement or ethical system tend to do some particular kind of bad thing more often than others can be useful, particularly if you are a part of that social movement. 

7
RobBensinger
1y
Ronny is talking about https://www.lesswrong.com/s/AmFb5xWbPWWQyQ244.
GideonF
1y
12
18
11

There are however a number of things we ARE at fault for here.

  1. We as a community idolised SBF, including promoting him in many presentations, and a relatively fawning interview by 80K which continued to promote the idea that SBF was living frugally (surely people knew by then that was bs). We could have chosen not to do this.
  2. Will MacAskill made the introduction to Elon to try and get SBF to help buy Twitter. We still have no public information why, but this would have given SBF more power and used a lot of money that could have been used on doing good to that end. Why?
  3. The Carrick Flynn campaign; we as a community hugely supported this campaign, which was quite blatantly SBF and GBF trying to buy a seat for their interests. Sure, we as a community thought this was also our interests (and I still assume Carrick would have done a good job?) but once again this was a way the community encouraged and didn't question SBF's power.
  4. Will MacAskill knew SBF for 9 years, seemingly relatively closely. It's not Will's fault SBF committed fraud, but it is partially Will's fault SBF became such a face for the community within and outside of it. Maybe no ordinary person could have known SBF was a fraudster. B
... (read more)

There are however a number of things we ARE at fault for here.

Yes, assuming that these were foreseeably bad calls. Seems good to separately ask "what responsibility do EAs bear for Sam's bad decisions?" and "what did we otherwise do wrong, or right?". E.g., if it were true that Sam would have made all the same missteps in the absence of EA, it could still be the case that we made Sam-related mistakes like "failing to propagate info about Sam's past bad behavior".

2. Will MacAskill made the introduction to Elon to try and get SBF to help buy Twitter. We still have no public information why, but this would have given SBF more power and used a lot of money that could have been used on doing good to that end.

It would have given SBF a different kind of power. I'm skeptical of the claim that SBF would be more powerful if he'd poured his money into Twitter, since that implies that Twitter is a more useful, leveraged thing to spend money on than SBF's other alternatives.

It seems more likely to me that either buying Twitter would reduce SBF's power/influence (because Twitter isn't very important), or that buying Twitter is a not-crazy sort of thing for EAs to try to do (because Twitte... (read more)

-5
GideonF
1y
Zvi
1y
11
3
0

From an upcoming post I am drafting: I would point out that ‘heroes put the entire group, many innocent people, ‘the city,’ planet Earth or even the whole damn universe or multiverse in grave danger to save any main character or other thing that We Cannot Bear To Lose, because That’s What Heroes Do’ is ubiquitous in our fantasy media. It might be a majority of DC comics plots. Villains invoke it because decision theory, they know it will work, and even without that it is rather mind-bogglingly awful. That kind of thinking needs to be widely condemned and fall in status at least via What The Hell Hero moments, and I worry it has more influence in these situations than we think. 

First a disclaimer that I’ve never got anywhere close to interacting with SBF personally; I’m very much an outsider to this situation. However, from everything I have read, I think it’s pretty ridiculous to suggest that EA wasn’t the main reason SBF tried so hard to maximize profit (poorly, I might add, but it seems like that was his goal) to the point of committing fraud. As far as I understand EA was SBF’s primary guiding ideology; it is why he went down this career path of Jane Street and then starting his own companies. This post seems overly reliant on the fun fact that SBF paid more for e-sports naming rights than on EA donations to show why actually Sam didn’t care about EA that much. But these are two completely separate things! E-sports naming rights is just a means of advertising, with the goal of making FTX more money which will eventually allow SBF to donate more to EA. I think there’s also decent evidence that SBF was looking to ramp up donations in the future, as Effective Altruism continues to grow and is able to use more funding. Once you take out this fun fact about SBF’s current EA spending, I think this whole argument kind of falls apart.

[This comment is no longer endorsed by its author]Reply
6
RobBensinger
1y
Seems like a reasonable objection to me. (Though it's still weird that SBF overpaid so much for that particular form of advertising; and it's weird that SBF didn't set aside money for FTX FF.)

I like this post much more than your previous post.

Is there a source for the $140M figure?

5
Douglas Knight
1y
My guess is that this is the June figure for the FTX Future Fund grant commitments. The current figure is $160M as of September 1st. Some of these grants were in installments, especially the multi-year ones, and not all of the money was transferred. This Fund was "longtermist" and I do not see a dollar figure on other FTX charitable giving. This does not include $500M in equity in Anthropic. Added, weeks later: Or maybe he got it from NYT, which seems to be sourced from a NYT article a month ago; I suspect those numbers are actually delivered, not promises. My guess is that the Future Fund pledged $190 million, 160 directly and 30 through regranting, delivered 100 and failed to deliver $90 million. (Plus $50 million not through the Future Fund, at least some of which counts as EA.)
2
Greg_Colbourn
1y
Thanks, there is also $32M from the regrants tab. But yes, difficult to know the actual total of payouts without word from the staff. Or payouts not subject to clawback without further details on legal proceedings.
0
Greg_Colbourn
1y
Kind of ironic that they were "longtermist" about the world, but not about their own existence!

The question that heads this post obviously answers itself, in that only actual perpetrators of bad deeds and their direct instigators (intellectual or otherwise) are to be held accountable for them; nevertheless, I must admit that I found Eliezer Yudkowsky's analogy unconvincing, and (not quite, but feeling a little bit) disingenuous. Whenever we see examples of adherents of some creed, ideology, religious or thought system going into nefarious places, it is natural to wonder if said ideas (whether properly or mistakenly interpreted) influenced... (read more)

When comparing the size of SBF/FTX outlay on EA vs. stuff like naming rights, I think it is important to compare apples to apples. As far as the victim's perspective, the key question is "how much money went out the door" as opposed to "how much did SBF/FTX plan or commit to spend in the future?" Although I don't know how the naming rights deals were set up, I suspect that much of the money was to be paid in the future. That means the stadiums, teams, etc. are now general unsecured creditors on any claims. I am hearing that depositor claims may be valued on the distressed-debt market at 3-5 cents on the dollar, so the claims of naming-rights counterparties are likely worth even less.

3
EliezerYudkowsky
1y
Fair point.

In case anyone wants a reference for the $210 million that FTX committed to spend on esports naming rights for TSM, a Washington Post article from today is here.

I'm very new here, having just signed in today, so I'm unfamiliar with all the formats at this point, but I often seek to explain how biological factors can play a role in morality or our decision making, because it can be useful to understand our brain's limitations among all the other factors.

Stress, isolation and positions of power have consequences for the brain. Functioning less cooperatively within one's society leads to damage in areas of the brain responsible for decision making, the anterior cingulate cortex being the most key area. It breaks... (read more)

I agree that EA likely wasn't a major causal factor for FTX/SBF's likely fraud. Unfortunately, it's a situation where even if it's not our fault, it is our problem. People are trashing EA across the internet because of Sam's position in the movement. His Twitter profile pic still has him wearing an EA shirt, for Christ's sake!

So are people who never attacked EA before suddenly doing so? That isn't what I've seen. I've seen lots of bad-faith takes about how this is proof of what they always thought, and news reporting which is about as accurate as you'd expect - that is, barely correct on the knowable facts, and misleading or confused about anything more complicated than that.

3
Justin Helps
1y
EA is a brand, and people on the outside don't have much information about it, so a negative association matters on the margin for recruiting. The main post makes a fair point about not going overboard with self blame, but it seems good for EA folks to be publicly concerned about how they could have acted better, or to publicly discuss the lessons they're taking. At the very least, I don't think it's worth much effort to stop people from doing so.

Can you imagine how you'd judge it if, rather than my writing it as a joke, Naomi Novik had gone online and sincerely tried to accept blame for FTX's fall, because she thought she hadn't been careful enough to put messages about good corporate governance and careful accounting into her fantasy novels, and Novik had talked about how she was planning to donate an appropriate portion of her Scholomance book royalties back to FTX's ruined customers?

Even so, I'm still recommending that people read Terry Pratchett instead of Novik. Something something low probabil... (read more)

We can infer that probably at least $30 of Scholomance sales are due to Caroline Ellison, and with the resources that Ellison commanded as co-CEO of Alameda

C'mon, if she's a true maximizer using depositors' money, I guess she'd just download it from z-library.
OMG, is this why z-lib was recently seized by FBI? 

Poor accounting, possibly just no really global accounting or sense of where the money was going;

I chatted with an Alameda python dev for about an hour. I tried to get a sense of their testing culture, QA practices, etc. Lmao: there didn't seem to be any. Soups of scripts, no time for tests, no internal audits. Just my impression. 

My type-driven and property-based testing zealot/pedant side has harvested some bayes points, unfortunately. 
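For readers who haven't met the term: a property-based test checks that an invariant holds across many randomly generated inputs, rather than a handful of hand-picked cases. A minimal stdlib-only sketch (the toy ledger here is hypothetical, not anything from Alameda's actual codebase):

```python
import random

def transfer(balances, src, dst, amount):
    """Toy ledger transfer -- the kind of function such tests guard."""
    new = dict(balances)
    new[src] -= amount
    new[dst] = new.get(dst, 0) + amount
    return new

# Property: no transfer ever creates or destroys money.
random.seed(0)
for _ in range(1000):
    start = {"exchange": random.randint(0, 10**9), "fund": random.randint(0, 10**9)}
    amount = random.randint(0, start["exchange"])
    end = transfer(start, "exchange", "fund", amount)
    assert sum(end.values()) == sum(start.values()), "money conservation violated"
print("property held on 1000 random cases")
```

Real property-based frameworks (e.g. Hypothesis) add input shrinking and smarter generators, but even this crude loop catches whole classes of bookkeeping bugs that a couple of example-based tests would miss.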

If there's anyone other than FTX who's really to blame, here, it's me.  I've written some fiction that tries to walk people through the experience of abandoning sunk costs and facing reality.  Including my most recent work.

Caroline Ellison, according to her tumblr, had even started reading it...

But her liveblogs cut out before she got very far in.

I just wasn't a good-enough writer; I lost my reader's attention, and with it, perhaps, the world.

 

We do not know her absolute state of mind when FTX (mis)used customer deposits. But, for all its wo... (read more)

I was not being serious there.  It was meant to show - see, I could blame myself too, if I wanted to be silly; now don't be that silly.

9
Davidmanheim
1y
I think you probably need to label your account "EliezerYudkowsky (parody)" because otherwise a few people might not realize you're occasionally being sarcastic, and then you might get banned from Twitter.

(Be it clear, I'm not analogizing myself to Novik in that metaphor.  I'm analogizing Peter Singer and classical Givewell-style EA to Novik.  I asked SBF if he wanted to meet with me ever, he never got around to it, I do not think he was a Yudkowsky fan and he hung out with some EAs who definitely weren't.)

 

Caroline Ellison is the disgraced and probably criminally responsible CEO of Alameda, involved in FTX’s downfall.

Despite Yudkowsky's citing of Peter Singer, almost none of SBF's FTX FF money went to Peter Singer's causes of global poverty ... (read more)

I was passing through the Bahamas and asked if FTX wanted me to talk to the EAs they had on fellowships there.  They paid for my hotel room and an Airbnb when the hotel got full, for a week.  I'm not sure but I don't think I remember getting to see SBF at all while I was at the hotel.  Didn't go swimming or sunning or any such because I am not a very outdoors person.  It does not seem entirely accurate to characterize this as "was hosted by SBF in the Bahamas".

The Future Fund basically turned down all my ideas until the regrantor program started; I made two recommendations and I expect neither of them will pay out now unless they moved very fast.

Unless I specifically defend an idea, I think that a lot of what gets said in the San Francisco Bay Area is also not something I'd accept as my fault.  Eg there was a lot of drug use involved in this going wrong, which I'm sure did not start from me, and I've suggested increasingly loudly and openly of late that people cut back on the drug use; maybe it's Bay-associated idk, but it sure is not Yudkowsky-endorsed.

I did think Will MacAskill was from the Singer side of things, so I admit to being surprised if the highly-legible side of effective altruism got nothing, unless it was a room-for-more-funding issue with Givewell+OpenPhil having already snapped up all the fruit hanging lower than GiveDirectly.  I will consider myself tentatively corrected on that point unless I hear otherwise or have investigated.

2
Tyler M
1y
Yudkowsky wrote this above. It would be wild to see anyone defend or explain the terms "Singer side" or "twisting people's brains" in this context, much less the intentional act implied. This is a flat-out attack that uses ideas and sentiment from actual criticisms of MIRI/LW, which I do not cite because it is inflammatory. This is likely to preempt anticipated future criticism using these arguments.
-3
Tyler M
1y
I am writing here because the EA community should know now that sentiment in global health and poverty, and animal welfare, is extremely low, especially among limited talent. As EAs know, the FTX money favored longtermist causes. In the aftermath of the FTX collapse, EA is globally harmed, further disadvantaging these causes, which were already in the shadow of this money. The departure of this talent could be a wholesale disaster for EA, and leave it in a permanently weakened state. It is not being discussed, like other dangers such as the risk from FTX, due to the dynamics of EA discourse, which is easily dominated by full-time influencers like Yudkowsky. In this vulnerable state, undue attempts to associate Peter Singer and "EA", and undue attempts to disassociate "LW" and "rationality", are an incredibly uncooperative defection.