Caroline Ellison, co-CEO and later CEO of Alameda, had a now-deleted blog, "worldoptimization" on Tumblr. One does not usually post excerpts from deleted blogs - the Internet has, of course, saved it by now - but it looks like Caroline violated enough deontology to be less protected than usual in turn, and also I think it's important for people to see what signals are apparently not reliable signs of honesty and goodness.
In a post on Oct 10 2022, Caroline Ellison crossposted her Goodreads review of The Golden Enclaves, book 3 of the Scholomance series by Naomi Novik. She writes, including only very light / abstract spoilers:
A pretty good conclusion to the series.
Biggest pro was the resolution of mysteries/open questions from the first two books. It wrapped everything up in a way that felt very satisfying.
Biggest con was … I think I felt less bought into the ethics of the story than I had for the previous two books?
The first two books often have a vibe of “you can either do the thing that’s easy and safe or you can do the thing that’s hard and scary but right, and being a good person is doing the right thing.” And I’m super on board with that.
Whereas if I had to sum up the moral message of the third book I might go with “there is no ethical consumption under late capitalism.”
For someone like myself, this is a pretty shocking thing to hear somebody say, on a Tumblr blog not then associated with their main corporate persona, not in a way that sounds like the usual performativity, not like it's meant to impress anybody (because then you're probably not writing about anything as undignified as fantasy fiction in the first place). It sounds like - Caroline might have been under the impression, as late as Oct 10, that what she was doing at FTX was the thing that's hard and scary but right? That she was doing, even, what Naomi Novik would have told her to do?
The Scholomance novels feature a protagonist, Galadriel Higgins, with unusually dark and scary powers, with a dark and scary prophecy about herself, trying to do the right thing anyways and being misinterpreted by her classmates, in an incredibly hostile environment.
The line of causality seems clear - Naomi Novik, by telling her readers to do the right thing, probably contributed to Caroline Ellison doing what she thought was the right thing - misusing Alameda's customer deposits. Furthermore, the Scholomance novels romanticized people with dark and scary powers, and those people not just immediately killing themselves in the face of a prophecy that they'd do immense harm later, i.e., sending the message that it's okay for them to take huge risks with other people's interests.
I expect this to be a very serious blow to Naomi Novik's reputation, possibly the reputation of fantasy fiction in general. The now-deleted Tumblr post is tantamount to a declaration that Caroline Ellison was doing this because she thought Naomi Novik told her to. We can infer that probably at least $30 of Scholomance sales are due to Caroline Ellison, and with the resources that Ellison commanded as co-CEO of Alameda, some unknown other fraction of Scholomance's entire revenues could have been due to phantom purchases that Ellison funded in order to channel customer deposits to her favorite author.
My moral here? It can also be summed up in an old joke that goes as follows: "He has no right to make himself that small; he is not that great."
The best summary of the FTX affair that I've read so far is Milky Eggs's "What Happened at Alameda Research?" If you haven't read it already, and you're at all interested in this affair, I recommend that you go read it right now.
Piecing together various sources, including some allegedly shared by ex-FTX employees (and some comments posted by those employees to the Effective Altruism forum), Milky Eggs tells a harrowing story of how Alameda Research probably lost in excess of $15 billion. Primary causative factors:
- Their actual arb strategies stopped working, and were frog-boilingly gradually replaced with long bets on crypto that paid out during the boom and exploded during the bust;
- Poor accounting, possibly just no really global accounting or sense of where the money was going;
- Excessive use of stimulants, including those known to result in compulsive gambling behavior;
- A corporate acquisitions spree, possibly partially motivated by buying up corporate entities that held the FTT token and could have tanked the market by dumping it, maybe even raiding those companies for their own customer deposits;
- A general lack of spending discipline: for example, buying naming rights to the e-sports organization TSM for $210M, which was way out of line with comparable deals in e-sports.
Completely missing from Milky Eggs's account: Any mention of effective altruism, except that the EA Forum is listed as a source for some of their alleged-ex-FTX-employee accounts.
Because - and I say this meaning it gently, and with kindness - you were not that fucking important.
The amount that FTX spent on e-sports naming rights for TSM was greater than everything they donated to effective altruism.
Can you imagine how you'd judge it if, rather than my writing it as a joke, Naomi Novik had gone online and sincerely tried to accept blame for FTX's fall, because she thought she hadn't been careful enough to put messages about good corporate governance and careful accounting into her fantasy novels, and Novik had talked about how she was planning to donate an appropriate portion of her Scholomance book royalties back to FTX's ruined customers? Depending on her state of mind, you might either try to gently console her and somehow get her to realize that she was being way too scrupulous and might possibly want to try standard meds for OCD at some point; or, on another hypothesis about Novik's state of mind, you might try to gently explain that she's not the center of the universe and that this wasn't mostly about her.
This would be true even if Sam Bankman-Fried himself had presented as a Naomi Novik fan, if he had told others that he wanted to be a Novik-style DoTheRightThingist just like Galadriel Higgins the Scholomance protagonist, and he had funneled $140M to causes having to do with things that were on-theme for some of Novik's books. The $140M would still be less than FTX had spent on e-sports naming rights. SBF calling himself a Novikian RightThingist would not have been much of a factor in why he was trusted, compared to their claims of being the first GAAP-audited crypto exchange and so on.
There probably would be some sort of weird blowup in the Novik fandom, in that case; it would make more sense for them to wonder if they were responsible. But I'd expect people in the Novik fandom to also vastly overestimate how much it was all about them, in that case, because they would know all about Novik but have less daily exposure to the much wider world in which FTX operated. They'd have heard about the money donated to RightThingism but not about the e-sports naming rights. They would not realize that there were other and bigger fish in the pond.
(Be it clear, I'm not analogizing myself to Novik in that metaphor. I'm analogizing Peter Singer and classical GiveWell-style EA to Novik. I asked SBF if he ever wanted to meet with me; he never got around to it. I do not think he was a Yudkowsky fan, and he hung out with some EAs who definitely weren't.)
(ADDED: I am not saying that EA influence on Alameda was comparable in magnitude to Novik's influence on Caroline Ellison; I am giving an example of the mental motion of trying to grab too much responsibility because you don't know about all the parts of the universe that aren't yourself.)
It wouldn't, even, reflect all that badly on the spirit running across many fantasy novels of RightThingism. Not just because "no true Scotsman", not even because SBF would have really actually missed the point of fantasy-novel RightThingism. But because the amount that FTX spent on e-sports naming rights vs the amount they gave to RightThingist causes, and how they didn't take a billion off the table for RightThingism while they still had a billion, maybe belied a bit the idea that RightThingism was in fact that central to their mental lives.
Also Milky Eggs's account says that FTX's own employees were encouraged to keep all their salaries on the exchange, which... I don't really have words. It's not - what you'd expect somebody to do if they still had even fantasy-novel RightThingism inside them. The Milky Eggs account says that Caroline Ellison was one of four FTX employees who knew. I wish I had a reliable printout of what Caroline Ellison was actually thinking at the time she wrote that Tumblr post. I would bet that, even without the benefit of hindsight on how it turned out, Naomi Novik wouldn't have agreed with it at the time.
And whatever Caroline Ellison was thinking when she wrote that, it is obvious - when you look at it from safely outside - that it wasn't Naomi Novik's fault.
If Caroline Ellison had worn a Naomi Novik T-shirt and put the Scholomance books in her Twitter profile and told her crypto clients "Trust me, I read fantasy novels and I know what the Right Thing is," it would still not have been Naomi Novik's fault.
It wouldn't have been the fault of the abstract concept of “you can either do the thing that’s easy and safe or you can do the thing that’s hard and scary but right, and being a good person is doing the right thing”. Plenty of people have read fantasy novels like that and not wrecked depository institutions. Not just in terms of moral responsibility, but actual causality, I'd be surprised if that was really in actual fact a key driver in the decisions that Caroline Ellison made; maybe she used that to rationalize that afterwards, but I doubt it's what was going through her mind on the fatal day that FTX used customer deposits to pay back Alameda creditors (if that's in fact when FTX first touched customer deposits). Pride did it, I'd sooner guess, or the desire to not not not be in this universe going so badly and taking the only step that preserved the feeling that everything could still be okay.
Who's at fault for FTX's wrongdoing?
Ask a simple question, get a simple answer.
You have no right to blame yourself any more than that. You weren't that important.
If there's anyone other than FTX who's really to blame, here, it's me. I've written some fiction that tries to walk people through the experience of abandoning sunk costs and facing reality. Including my most recent work.
Caroline Ellison, according to her tumblr, had even started reading it...
But her liveblogs cut out before she got very far in.
I just wasn't a good-enough writer; I lost my reader's attention, and with it, perhaps, the world.
Now, some people might say here: "But Eliezer, aren't you co-writing that story with another author?" And to this I can only reply: I see no reason why the existence of any other people in the universe ought to detract from my own sole accountability for everything that anyone does inside it.
What about the parts of EA that aren't Peter Singer and classical GiveWell-style EA? If those parts of EA were somewhat responsible, would it be reasonable to call that EA as well?
I don't think the analogy is helpful. Naomi Novik presumably does not claim to emphasize the importance of understanding tail risks. Naomi presumably didn't meet Caroline and encourage her to earn a lot of money so she can donate to fantasy authors, nor did Caroline say "I'm earning all of this money so I can fund Naomi Novik's fantasy writing". Naomi Novik did not have Caroline on her website as a success story of "this is why you should earn money to buy fantasy books or support other fantasy writers". Naomi didn't have a "Fantasy writer's fund" with the FTX brand on it.
I think it's reasonable to preach patience if you think people are jumping too quickly to blame themselves. I think it's reasonable to think that EA is actually less responsible than the current state of discourse on the forum suggests. And I'm not making a claim about the extent to which EA is in fact responsible for the events. But the analogy as written is pretty poor, and...
I agree that if I, personally, had steered SBF into crypto, and uncharacteristically failed to add on a lot of "hey but please don't scam people, only do this if you find a kind of crypto you can feel good about" I might consider myself more at fault. I even think that the Singer side of EA in fact does less talking about deontology, less writing of fiction that exemplifies the feelings and reasoning behind that deontology, less cautioning of people against twisting up their brains by chasing good ideas; on my view, the Singer side explicitly starts by trying to twist people's brains up internally, and at some point we should all maybe have a conversation about that.
The thing is, if you want to be sane about this sort of thing, even so and regardless I think Peter Singer himself would not have approved this, would obviously not have approved this. When somebody goes that far off the rails, I just don't see how you could reasonably hold responsible people who didn't tell them to do that and would've obviously not wanted them to do that.
Given how big of a role EA apparently had in the origin of Alameda (Singh says in the Sequoia puff piece that it wouldn’t have started without EA), there very likely are many members of the community who offered more encouragement and/or didn’t give as many warnings as they should have.
I don’t know at what point fault transcends the individual and attaches to the community, but at the very least, adding up other individuals’ culpabilities in steering SBF to crypto without appropriate caution would seem to put a lot of the blame you say you personally avoid on EA as a whole.
DM conversation I had with Eliezer in response to this post. Since it was a private convo and I was writing quickly, I had somewhat exaggerated in a few places, which I've now indicated with edits...
Commending Habryka for being willing to share about these things. It takes courage, and I think reflections/discussions like this could be really valuable (perhaps essential) to the EA community having the kind of reckoning about FTX that we need.
Great points, all. Even if most people could do nothing and Sam was not motivated by a core problem with EA philosophy, that doesn’t mean there was nothing that EAs close to the situation could have done differently. I would love to see a public airing of what genuine evidence people think they might have had that should have changed those people’s behavior around Sam.
Assuming that this means that the FTX leadership is friends with prominent EAs, I think that this fact raises some questions that many people might consider important.
For instance, I think some people might find it important to know what those friends have been doing with respect to this situation for the past week. What sort of communication have they had with the FTX leadership? Do they still feel loyalty toward SBF/Caroline/etc.? Are they in any way aiding or abetting them to commit crimes or avoid the legal or reputational consequences of their actions?
These might be dumb questions, and I apologize if so. They occurred to me because I model people as being quite likely to aid and abet with their close friends' criminal or malicious activity, but I acknowledge that that model could be wrong and/or not very applicable to this situation.
Here are some excerpts from Sequoia Capital's profile on SBF (published September 2022, now pulled).
On career choice: ...
Some corrections of the Sequoia info:
But mostly, if https://forum.effectivealtruism.org/posts/xafpj3on76uRDoBja/the-ftx-future-fund-team-has-resigned-1?commentId=hpP8EjEt9zTmWKFRy is accurate, I'm bummed that the money I helped earn was squandered right away.
Definitely: you are obviously right and Eliezer obviously wrong about this, imho.
I do think it is hindsight bias to some degree to think that "EA" as a collective or Will MacAskill as an individual can be regarded as doing something wrong, in the sense of "predictably a bad idea," at any point in the passages you quote. (I know you didn't actually claim that!) It's not immoral to tell someone to found a business, so it's definitely not immoral to tell someone to found a business and give to charity. It's not immoral to help someone make a legal, non-scammy trade, as the anonymous Japanese EA apparently did ("buy low and sell high" is not poor business ethics as far as I know, though I'm prepared to be corrected about that by someone who actually knows finance). It's a bit more controversial to say it's not wrong to take very rich people's money to do the sort of work EA charities do, but it's certainly not obvious that it is, and nothing in the quoted passages actually shows that any individual had evidence that FTX were a bad org to be associated with. (They may well have; I'm not saying no one did wrong, I'm just saying no wrong-doing is suggested by the information quoted here...
Thanks for this comment.
I'm more interested in reflecting on the foundational issues in EA-style thinking that contributed to the FTX debacle than in ascribing wrongdoing or immorality (though I agree that the whole episode should be thoroughly investigated).
Examples of foundational issues:
- FTX was an explicitly maximalist project, and maximization is perilous
- Following a utilitarian logic, FTX/Alameda pursued a high-leverage strategy (Caroline on leverage); the decision to pursue this strategy didn't account for the massive externalities that resulted from its failure
- The Future Fund failed to identify an existential risk to its own operation, which casts doubt on their/our ability to perform risk assessment
- EA's inability and/or unwillingness to vet FTX's operations (lack of financial controls, lack of board oversight, no ring-fence around funds committed to the Future Fund) and SBF's history of questionable leadership points to overeager power-seeking
- MacAskill's attempt to broker an SBF <> Elon deal re: purchasing Twitter also points to overeager power-seeking
- Consequentialism straightforwardly implies that the ends justify...
The point is not "EA did as little to shape Alameda as Novik did to shape Alameda" but "here is an example of the mental motion of trying to grab too much responsibility for yourself".
This seems to be a false equivalence. There's a big difference between asking "did this writer, who wrote a bit about ethics and this person read, influence this person?" vs "did this philosophy and social movement, which focuses on ethics and this person explicitly said they were inspired by, influence this person?"
I agree with you that the question
has the answer
But the question
Is nevertheless sensible and cannot have the answer FTX.
Couldn't agree more strongly.
The inferential jump from someone reading a book in their spare time, making a pretty superficial Goodreads review about a main takeaway, to
Is a pretty big one, and kinda egregious honestly.
Agree, we shouldn't give a pass to irrational (frankly, egocentric) thinking just because it feels like taking responsibility.
I feel especially irritated with people who are ready to change their entire utilitarian philosophy just because someone associated with us (probably) committed a major crime and got caught, as if they didn't understand last week that they lived in a world where surprises like that can happen. I don't understand how else they could update their moral philosophy so fast based on the info we have.
Maybe they weren't familiar with the overwhelming volume of previous historical incidents, hadn't had their brains process history or the news as real events rather than mythology, or were genuinely unsure about how often these sorts of things happened in real life rather than becoming available on the news. I'm guessing #2.
I agree that this is pretty weird. There were presumably a bunch of historical contingencies that went into whether the FTX implosion occurred; it seems weird if we should endorse some moral philosophy X in the world where all those contingencies occurred, and some different moral philosophy Y in the world where not all of those contingencies occurred.
And it also seems weird if we should endorse the same moral philosophy in both worlds, but this one data point -- an important data point EV-wise, but still a single event, historical contingency and all -- is crucial evidence about such a high-level proposition. Evidence that we somehow didn't acquire via looking at the entirety of human history, the entire psychology and sociology literature, etc.
The least-weird versions of this update I can imagine are:
- "This isn't a large update about high-level questions like that, but it's at least an interesting case study. We shouldn't treat it as a huge deal evidentially, but having a Schelling case study
I think it’s very worth reflecting on strategic decisions that were made around Sam. I just don’t think what happened is very significant to whether utilitarianism is the correct moral philosophy.
I think part of the story here is a weird status dynamic where...
1. I would basically trust some people to try the explicit direct utilitarian thing: eg I think it is fine for Derek Parfit or Toby Ord.
2. This creates some weird correlation where the better you are on some combination of (smartness/understanding of ethics/power in modelling the world), the more you can try to be actually guided by consequences
3. This can make being 'hardcore' consequentialist ...sort of cool and "what the top people do"
4. ... which is a setup where people can start goodhart/signal on it
Yeah, I think it's a severe problem that if you are good at decision theory you can in fact validly grab big old chunks of deontology directly out of consequentialism including lots of the cautionary parts, or to put it perhaps a bit more sharply, a coherent superintelligence with a nice utility function does not in fact need deontology; and if you tell that to a certain kind of person they will in fact decide that they'd be cooler if they were superintelligences so they must be really skillful at deriving deontology from decision theory and therefore they can discard the deontology and just do what the decision theory does. I'm not sure how to handle this; I think that the concept of "cognitohazard" gets vastly overplayed around here, but there's still true facts that cause a certain kind of person to predictably get their brain stuck on them, and this could plausibly be one of them. It's also too important of a fact (eg to alignment) for "keep it completely secret" to be a plausible option either.
I completely agree that a motivated person could easily believe that any decision is the right act utilitarian decision because there aren't clear rules for determining the right act utilitarian decision and checking your answer. Totally.
But idk if it's even fair to say Sam was using act utilitarianism as a decision procedure. It's not clear to me if he even believed that while he was (allegedly) committing the fraud.
Not trying to take this out on you, but I'm annoyed by how much all this advocacy of deontology all of sudden overlaps with covering our own asses. I don't buy it as a massive update about morality or psychology from the events themselves but a massive update about optics.
Reposting from twitter: It's a moderate update on the prevalence of naive utilitarians among EAs.
Classical problem with this debate on utilitarianism is the vocabulary used makes motte-and-bailey defense of utilitarianism too easy.
1. Someone points to a bunch of problems with an act consequentialist decision procedure / cases where naive consequentialism tells you to do bad things
2. The default response is "but this is naive consequentialism, no one actually does that"
3. You may wonder whether, while people don't advocate for or self-identify as naive utilitarians ... they actually make the mistakes
The case provides some evidence that the problems can actually happen in practice in important enough situations to care. [*]
Also, you have the problem that sophisticated naive consequentialists could be tempted to lie to you about their morality ("no worries, you can trust me, I'm following the sensible deontic constraints!"). Personally, before the recent FTX happenings, I would be more of the opinion "nah, this sounds too much like an example from a philosophical paper, unlikely with typical human psy...
I'd agree with this statement more if it acknowledged the extent to which most human minds have the kind of propositional separation between "morality" and "optics" that obtained financially between FTX and Alameda.
This strikes me as a bad play of "if there was even a chance". Is there any cognitive procedure on Earth that passes the standard of "Nobody ever might have been using this cognitive procedure at the time they made $mistake?" That more than three human beings have ever used? I think when we're casting this kind of shade we ought to be pretty darned sure, preferably in the form of prior documentation that we think was honest, about what thought process was going on at the time.
Why require surety, when we can reason statistically? There've been maybe ten comparably-sized frauds ever, so on expectation, hardline act utilitarians like Sam have been responsible for 5% of the worst frauds, while they represent maybe 1/50M of the world's population (based on what I know of his views 5-10yrs ago). So we get a risk ratio of about a million to 1, more than enough to worry about.
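The arithmetic behind that "about a million to 1" figure can be sketched as a back-of-envelope calculation; the inputs here are the commenter's rough guesses (ten comparably-sized frauds, 5% attributable to the group, group size 1/50M of the population), not established facts:

```python
# Back-of-envelope risk-ratio estimate, using the commenter's assumed numbers.
fraction_of_major_frauds = 0.05    # "responsible for 5% of the worst frauds" (0.5 of ~10 frauds)
fraction_of_population = 1 / 50e6  # "maybe 1/50M of the world's population"

# Ratio of the group's per-capita fraud rate to the base rate:
risk_ratio = fraction_of_major_frauds / fraction_of_population
print(f"risk ratio ~ {risk_ratio:,.0f} to 1")  # prints: risk ratio ~ 2,500,000 to 1
```

So under these assumptions the ratio comes out nearer 2.5 million to 1 than 1 million to 1, though at this level of precision the conclusion ("more than enough to worry about") is the same either way.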
Anyway, perhaps it's not worth arguing, since it might become clearer over time what his philosophical commitments were.
Looks like Eliezer was similarly confused by your phrasing; your new argument ("almost no multibillion dollar frauds have ever happened, so we should do a very large update about the badness of everything that might have contributed to SBF defrauding people") sounds very different, and makes more sense to me, though I suspect it won't end up working.
idk, when people explicitly endorse your ideology as why they endorse "high leverage and double-or-nothing flips" I think it's at least worth taking a look at yourself. Now quite probably the person in question has misunderstood your ideology and doesn't understand why EAs do in fact care about the risk of ruin and why stealing money isn't ok, but then perhaps try to correct them?
Fwiw I think it very unlikely that the decision to use customer funds was a one-off decision made in 2022. My view is that FTX was set up from the start to use customer money as a source of cheap capital for Alameda. In 2018 Alameda was offering potential investors a 15% guaranteed return on loans. It seems fairly likely that at some point SBF figured "fuck this, why are we offering these dorks 15% when we can just set up our own exchange and access huge amounts of capital at 0%?" Never mind the fact that privileged information from the exchange may well have opened up more ways for Alameda to make money!
The plan, imo, was always to accrue as much wealth as possible as fast as possible with as few ethical constraints as possible. This worked for a while because Alameda's trades wer...
I mean he is a big sports fan, at least baseball, at least when he was younger. I got linked to his blog from 10 years ago from something, and the number one and two sets of posts were about baseball statistics.
The EA movement's role in the case of FTX seems surely to meet the level of influence it has had in some of its impact wins to date.
Perhaps most prominently, the movement:
For example, when comparing to the case of Sendwave, the influence seems at least comparable, if not larger: e.g. it played a motivational role in founding a company for the purpose of improving the world. (I'm not familiar with Wave's founders' motivations, so I could be wrong here.)
In welfare terms alone, the impact of FTX's collapse on its customers seems plausibly comparable to some of the impact wins of the movement to date, i.e. on the order of $1bn in lost funds. Given this, I think that an honest impact evaluation of the EA movement would include the harm caused to customers through FTX's collapse.
This is relevant not for blame assignment, but because it's very decision-relevant to EA's mission of improving the world. For example, when in the future deciding how much to emphasise harm avoidance when encouraging the (good and novel) idea of Earning to Give.
Are you talking about welfare terms or financial terms? Because $1bn in lost savings of FTX customers seems very different in welfare terms to $1bn spent on bed-nets etc. I think there are strong reasons FTX shouldn't have acted the way it did, but suggesting these two things are comparable in welfare terms because they are similar in financial terms seems like an error to me.
It is well-written, but I am not particularly convinced by the fantasy fiction analogy — it feels a lot more like “Here’s this very different situation, and you agree that the conclusions would be different. That would even be true if we modify it in several hard-to-imagine ways.”
In particular, I don’t see any reasonable analogies for:
That SBF ultimately contributed such a paltry amount of his apparent fortune is more impactful, but mainly as a reminder of how small and vulnerable EA actually is. It might very well be true that we didn’t mean that much to SBF, but he meant a lot to us.
I don’t [currently] view EA as particularly integral to the FTX story either. Usually, blaming ideology isn’t particularly fruitful because people can contort just about anything to suit their own agendas. It’s nearly impossible to prove causation, we can only gesture at it.
However, I’m nitpicking here but - is spending money on naming rights truly evidence that SBF wasn’t operating under a nightmare utilitarian EA playbook? It’s probably evidence that he wasn’t particularly good at EA, although one could argue it was the toll to further increase earnings to eventually give. It’s clearly an ego play but other real businesses buy naming rights too, for business(ish) reasons, and some of those aren’t frauds… right?
I nitpick because I don't find it hard to believe that an EA could also 1) be selfish, 2) convince themselves that ends justify the means and 3) combine 1&2 into an incendiary cocktail of confused egotism and lumpy, uneven righteousness that ends up hurting people. I’ve met EAs exactly like this, but fortunately they usually lack the charm, knowhow and/or resources required to make much of a dent.
In general, I’m not surprised with the community's react...
The point there isn't so much, "He could not have had any EA thoughts in his head at all", which I doubt is really true - though also there could've just been pressure from coworkers, and office politics around it, resolving in something like the Future Fund so that they were doing anything. My point is just that this nightmare is probably not one of a True Sincere Committed EA Act Utilitarian doing these things; that person would've tried to take more money off the table, earlier, for the Future Fund. Needing an e-sports site named after your company - that's indeed something that other businesses do for business reasons; and if it feeds your business, that's real, that's urgent, that has to happen now. The philanthropy side was evidently not like that.
I still think that this incident should overall update most EAs in the direction of 1) ethical injunctions are important for humans and 2) more EAs should read the ethical injunctions section of the sequences. I agree that there is no system of ethics, or cultural movement, so awesome that it will stop its most loyal adherents from doing terrible things, but some do better than others. Nobody should feel guilty except for the people who committed the crime, but it would be great if EAs thought the right amount about how to lower the prob of events like this in the future, and that amount is not zero.
I'm also not sure how to square your advice about how I should relate to this incident with heroic responsibility.
Which ethical systems do you think have a better track record and why? Does virtue ethics, the preferred moral system of Catholics, have to take responsibility for pedophile priests? Does the rule-based ethics of deontology have to take responsibility for mass incarceration in the USA?
I can understand people claiming that this ethics implies that crazy conclusion, or assigning blame to an idea that seems clearly to have inspired a particular person to do a particular act. But I have no confidence that anybody on this earth has a clue about which ethical system is most or least disproportionately to blame for common-sense forms of good or bad behavior.
I think liberalism has a better track record than communism, for instance. No, but I do think Catholics should spend some time thinking about what's up with Catholic priests molesting children, particularly if that Catholic has any control over what goes on in the church. In general, I do not think blaming this or that ethical system or social movement makes much sense, but noticing that the adherents of some social movement or ethical system tend to do some particular kind of bad thing more often than others can be useful, particularly if you are a part of that social movement.
There are, however, a number of things we ARE at fault for here.
Yes, assuming that these were foreseeably bad calls. Seems good to separately ask "what responsibility do EAs bear for Sam's bad decisions?" and "what did we otherwise do wrong, or right?". E.g., if it were true that Sam would have made all the same missteps in the absence of EA, it could still be the case that we made Sam-related mistakes like "failing to propagate info about Sam's past bad behavior".
It would have given SBF a different kind of power. I'm skeptical of the claim that SBF would be more powerful if he'd poured his money into Twitter, since that implies that Twitter is a more useful, leveraged thing to spend money on than SBF's other alternatives.
It seems more likely to me that either buying Twitter would reduce SBF's power/influence (because Twitter isn't very important), or that buying Twitter is a not-crazy sort of thing for EAs to try to do (because Twitte...
From an upcoming post I am drafting: I would point out that ‘heroes put the entire group, many innocent people, ‘the city,’ planet Earth or even the whole damn universe or multiverse in grave danger to save any main character or other thing that We Cannot Bear To Lose, because That’s What Heroes Do’ is ubiquitous in our fantasy media. It might be a majority of DC comics plots. Villains invoke it because decision theory, they know it will work, and even without that it is rather mind-bogglingly awful. That kind of thinking needs to be widely condemned and fall in status at least via What The Hell Hero moments, and I worry it has more influence in these situations than we think.
First, a disclaimer: I’ve never gotten anywhere close to interacting with SBF personally; I’m very much an outsider to this situation. However, from everything I have read, I think it’s pretty ridiculous to suggest that EA wasn’t the main reason SBF tried so hard to maximize profit (poorly, I might add, but it seems like that was his goal) to the point of committing fraud. As far as I understand, EA was SBF’s primary guiding ideology; it is why he went down this career path of Jane Street and then starting his own companies. This post seems overly reliant on the fun fact that SBF paid more for e-sports naming rights than on EA donations to show that Sam didn’t actually care about EA that much. But these are two completely separate things! Buying e-sports naming rights is just a means of advertising, with the goal of making FTX more money, which would eventually allow SBF to donate more to EA. I think there’s also decent evidence that SBF was looking to ramp up donations in the future, as effective altruism continues to grow and is able to use more funding. Once you take out this fun fact about SBF’s current EA spending, I think this whole argument kind of falls apart.
I like this post much more than your previous post.
In case anyone wants a reference for the $210 million that FTX committed to spend on esports naming rights for TSM, a Washington Post article from today is here
Is there a source for the $140M figure?
The question that heads this post obviously answers itself, in that only actual perpetrators of bad deeds and their direct instigators (intellectual or otherwise) are to be held accountable for them; nevertheless, I must admit that I found Eliezer Yudkowsky's analogy unconvincing, and (not quite, but feeling a little bit) disingenuous. Whenever we see examples of adherents of some creed, ideology, or religious or thought system going into nefarious places, it is natural to wonder whether said ideas (whether properly or mistakenly interpreted) influenced...
We do not know her absolute state of mind when FTX (mis)used customer deposits. But, for all its wo...
I was not being serious there. It was meant to show - see, I could blame myself too, if I wanted to be silly; now don't be that silly.
When comparing the size of SBF/FTX outlay on EA vs. stuff like naming rights, I think it is important to compare apples to apples. From the victims' perspective, the key question is "how much money went out the door" as opposed to "how much did SBF/FTX plan or commit to spend in the future?" Although I don't know how the naming rights deals were set up, I suspect that much of the money was to be paid in the future. That means the stadiums, teams, etc. are now general unsecured creditors on any claims. I am hearing that depositor claims may be valued on the distressed-debt market at 3-5 cents on the dollar, so the claims of naming-rights counterparties are likely worth even less.
I'm very new here - I just signed up today, so I'm unfamiliar with all the formats at this point - but I often seek to explain how biological factors can play a role in morality and our decision-making, because it can be useful to understand our brain's limitations among all the other factors.
Stress, isolation, and positions of power have consequences for the brain. Having to function less cooperatively within one's society leads to damage in areas of the brain responsible for decision-making, the anterior cingulate cortex being the most key area. It breaks...
I agree that EA likely wasn't a major causal factor for FTX/SBF's likely fraud. Unfortunately, it's a situation where even if it's not our fault, it is our problem. People are trashing EA across the internet because of Sam's position in the movement. His Twitter profile pic still has him wearing an EA shirt, for Christ's sake!
So are people who never attacked EA before suddenly doing so? That isn't what I've seen. I've seen lots of bad-faith takes about how this is proof of what they always thought, and news reporting which is about as accurate as you'd expect - that is, barely correct on the knowable facts, and misleading or confused about anything more complicated than that.
Even so, I'm still recommending that people read Terry Pratchett instead of Novik. Something something low probabil...
C'mon, if she's a true maximizer using depositors' money, I guess she'd just download it from z-library
OMG, is this why z-lib was recently seized by the FBI?
I chatted with an Alameda Python dev for about an hour. I tried to get a sense of their testing culture, QA practices, etc. Lmao: there didn't seem to be any. Soups of scripts, no time for tests, no internal audits. Just my impression.
My type-driven and property-based testing zealot/pedant side has harvested some Bayes points, unfortunately.
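For anyone unfamiliar with property-based testing: the idea is to assert an invariant over many randomly generated inputs instead of a handful of hand-picked cases. A minimal sketch using only the standard library (a toy run-length encoder invented here for illustration; real projects would typically use a library like Hypothesis):

```python
import random

def encode(xs):
    """Toy run-length encoder: [1, 1, 2] -> [(1, 2), (2, 1)]."""
    out = []
    for x in xs:
        if out and out[-1][0] == x:
            out[-1] = (x, out[-1][1] + 1)
        else:
            out.append((x, 1))
    return out

def decode(pairs):
    """Inverse of encode: expand each (value, count) pair."""
    return [x for x, n in pairs for _ in range(n)]

# The property: decoding an encoding returns the original list,
# checked against many random inputs rather than a few fixed ones.
random.seed(0)
for _ in range(1000):
    xs = [random.randint(0, 3) for _ in range(random.randint(0, 20))]
    assert decode(encode(xs)) == xs, f"round-trip failed for {xs}"
```

The point is that a random search over the input space tends to find the edge cases (empty lists, long runs) that hand-written unit tests miss - the kind of safety net a "soup of scripts" codebase lacks.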