
Here is a quote from SBF on Conversations with Tyler (emphasis mine, for skimmability): https://conversationswithtyler.com/episodes/sam-bankman-fried/

COWEN: Okay, but let’s say there’s a game: 51 percent, you double the Earth out somewhere else; 49 percent, it all disappears. Would you play that game? And would you keep on playing that, double or nothing? 
BANKMAN-FRIED: With one caveat. Let me give the caveat first, just to be a party pooper, which is, I’m assuming these are noninteracting universes. Is that right? Because to the extent they’re in the same universe, then maybe duplicating doesn’t actually double the value because maybe they would have colonized the other one anyway, eventually. 
COWEN: But holding all that constant, you’re actually getting two Earths, but you’re risking a 49 percent chance of it all disappearing. 
BANKMAN-FRIED: Again, I feel compelled to say caveats here, like, “How do you really know that’s what’s happening?” Blah, blah, blah, whatever. But that aside, take the pure hypothetical. 
COWEN: Then you keep on playing the game. So, what’s the chance we’re left with anything? Don’t I just St. Petersburg paradox you into nonexistence? 
BANKMAN-FRIED: Well, not necessarily. Maybe you St. Petersburg paradox into an enormously valuable existence. That’s the other option.
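To make the arithmetic behind Cowen's worry explicit, here is a minimal sketch (mine, not part of the interview): each round multiplies the expected number of Earths by 2 × 0.51 = 1.02, while the probability that anything at all survives shrinks by a factor of 0.51.

```python
# Repeated play of the 51/49 double-or-nothing game from the quote above.
# A minimal sketch of the arithmetic; the probabilities are from the interview.
for n in (1, 10, 100, 1000):
    p_survive = 0.51 ** n   # every round so far must have come up "double"
    ev = (2 * 0.51) ** n    # expected number of Earths after n rounds
    print(f"rounds={n:4d}  P(anything survives)={p_survive:.2e}  E[Earths]={ev:.2e}")
```

Expected value diverges while the survival probability goes to zero: repeated play almost surely ends in nonexistence, yet is enormously valuable in expectation, which is exactly the tension in the exchange above.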

----

1) Is this quote from SBF aligned with EA? ("Not particularly aligned or unaligned" is, of course, a valid response.)
2) Regardless of your first answer, can you articulate a value system, or system of ethics, under which it is moral to play the game Tyler describes (ad infinitum), but not moral to risk 80% of FTX depositor funds at 75% odds of doubling the money and donating all of it to effective charity (even once, much less ad infinitum)? (The expected-value arithmetic for both gambles is sketched below.)

Bonus question: Did you find SBF's response to this question surprising? Do you think most leaders in the EA community would have found it surprising?
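For comparison, here are the one-shot expected values of the two gambles in question 2, with the bankroll normalized to 1 (my own back-of-envelope arithmetic; the 80% and 75% figures are from the question as posed):

```python
# One round of Cowen's game: 51% the stake doubles, 49% it all disappears.
ev_cowen = 0.51 * 2 + 0.49 * 0                 # = 1.02

# The FTX-style bet: keep 20% of deposits, stake 80% at 75% odds of doubling.
ev_ftx = 0.20 + 0.80 * (0.75 * 2 + 0.25 * 0)   # = 0.20 + 1.20 = 1.40

print(ev_cowen, ev_ftx)                        # 1.02 1.4
```

Both gambles are positive in expectation, and on raw expected value the FTX-style bet is the better of the two, which is what gives question 2 its bite.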

----
I frankly find this response, at first glance, to be not nearly risk-averse enough for my tastes, but I can't for the life of me find a compelling alternative that is not vulnerable to some sort of counterintuitive conclusion. So my response is generally "let he who is not Dutch-bookable cast the first stone."

Those who maximize expected utility with unbounded utility functions are Dutch-bookable in principle. Here's an example that forces a stochastically dominated strategy with just one decision: whatever outcome your St. Petersburg lottery delivers, you trade it in for a fresh St. Petersburg lottery that is ex ante worse than the first one was before you saw its outcome.
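To make the trade-in concrete, here is a toy version (my own construction, following the standard form of this argument; the $1 fee is an assumption chosen for concreteness):

```python
# St. Petersburg lottery: payoff 2**n with probability 2**-n for n = 1, 2, ...
# Each level contributes 2**n * 2**-n = 1 to the expected value, so the EV
# diverges. Suppose you hold a realized outcome x = 2**k and are offered a
# fresh copy of the lottery with every payoff reduced by $1. Its EV is still
# infinite, so the partial sums eventually beat any finite x:
for k in (1, 5, 20):
    x = 2 ** k                 # whatever the first lottery happened to pay
    ev_minus_fee, levels = -1.0, 0
    while ev_minus_fee <= x:
        levels += 1
        ev_minus_fee += 1.0    # level n adds 2**n * 2**-n = 1 to the EV
    print(f"realized x = 2**{k}: the discounted lottery's partial EV "
          f"passes it after {levels} levels")

# So an unbounded-EU maximizer trades no matter what x turned out to be.
# But the strategy "always trade" just converts the original lottery into
# (St. Petersburg - $1), which pays less at every outcome: a stochastically
# dominated strategy, reached in a single decision.
```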

There are also Dutch book arguments that guarantee a 100% loss across infinitely many decisions, e.g., McGee (1999) and Pruss (2022).
