Epistemic status: I don't actually know whether EA is over-invested in crypto. This post is intended to spark discussion on the topic.

This is a short post on:

  • The diminishing marginal utility of money in terms of giving opportunities,
  • the implications on investment strategies for EA money,
  • and a question about our best guesses at the actual diminishing returns on investment in EA.

(This post is inspired by crypto, and in practice is about crypto, but isn't necessarily about crypto)


I've had a few friends ask me how much money they should invest in crypto. My answer is that this depends on two things. The first is the hard question of the bet's expected value (EV). The second is that, even if a bet has positive EV in terms of money, it might not be positive in terms of utility (to the individual).

This is because of the diminishing marginal utility of money. Imagine for example a "Person A" who is skint - their quality of life is relatively low and even a small amount of money would substantially increase their utility. Then, imagine a "Person B" with a $1M net wealth - they probably have a pretty good life. Lastly, imagine a "Person C" with a $2M net wealth. How much impact did that extra $1M make in terms of quality of life? Not nearly as much as the first $1M.

On average, each additional dollar you get has less impact than the last on your quality of life (although the curve might have steps in it, where you get enough for a car or a house etc.)

This has an impact on the kinds of bets you can (as a selfish actor) rationally take. For example, let's say Person B has the opportunity to take the following bet:

  • Invest $1,000,000 to get:
    • a 50% chance at gaining an extra $1,100,000, or
    • a 50% chance at losing the $1,000,000.

We can see the EV of this is $50,000 (0.5 × $1,100,000 − 0.5 × $1,000,000). However, Person B should not take the bet - the EU (expected utility) is probably negative, because the -$1M outcome is much more negative in utility than the +$1.1M outcome is positive.
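To make this concrete, here's a toy calculation of the bet above under a log-utility model. The numbers other than the bet itself are assumptions for illustration: in particular, I give Person B a hypothetical $100k of wealth that survives a total loss (so that log-utility stays finite), and log(wealth) is just one common stand-in for an individual's utility curve, not a claim about Person B's actual preferences.

```python
import math

# Toy model (assumed, for illustration): Person B has $1M to invest
# plus $100k of non-investable wealth, and utility = log(wealth).
base = 100_000          # wealth that survives a total loss (assumption)
stake = 1_000_000

win = base + stake + 1_100_000   # 50% chance: gain an extra $1.1M
lose = base                      # 50% chance: the $1M stake is gone

ev = 0.5 * 1_100_000 + 0.5 * (-1_000_000)        # expected value, dollars
eu = 0.5 * math.log(win) + 0.5 * math.log(lose)  # expected log-utility
u_now = math.log(base + stake)                   # utility of not betting

print(f"EV of the bet: ${ev:,.0f}")        # positive
print(f"Utility change: {eu - u_now:.3f}")  # negative: decline the bet
```

With these numbers the EV comes out at +$50,000 while the expected change in log-utility is negative, which is the whole point: a money-positive bet can be utility-negative.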

This is pretty intuitive for individuals, but the same thing is true for the EA movement as a whole - just at much bigger amounts of money.

Let's say EA has the opportunity to take a positive-EV bet with 50% of all wealth pledged to EA (say $20B?). At what level of payoff is that bet worth it? Let's say the bet is a 50/50 triple-or-nothing bet. So, either EA ends up with half its money, or ends up with double. I'd guess (based on not much) that right now losing 50% of EA's money is more negative than doubling EA's money is positive. What about 4x or 10x returns?

This isn't a theoretical question. EA already has a lot of money held in crypto, both through Sam Bankman-Fried's holdings and through many individual EAs' portfolios. Crypto is a notoriously high-risk / high-reward asset class, and holding wealth in crypto for patient philanthropy is implicitly taking a high-risk bet. We should be explicit about whether we think that's a good idea or not.

I certainly don't know the answers, but I am interested to hear people's guesses on the actual "$$ to utility curve" for EA's altruistic giving opportunities. I'm also interested in how much EA money is currently held in high-risk investments, especially crypto.


EA isn't a superorganism where everyone (and all organizations) shares the same empirical assumptions (or normative views). This complicates the analysis, because money held by people whose grantmaking one doesn't consider important should already be discounted.

edit: In the current funding landscape, my point isn't very relevant because it seems relatively easy to get funding if someone's project looks strong on some plausible assumptions. However, in the future, we could imagine scenarios where large sums of money are deployed for ambitious strategies where some people think the strategy is good and others think it might be too risky. Example: Buying compute for an AI company where some funders think the company has enough of a safety mindset, while other funders would prefer to wait and evaluate. 

That's a great point about disagreement on the effectiveness of the interventions themselves (rather than the investments). I'm not really sure how to think about that. I think we already have a process for figuring out the allocation: hedging against other people's "intervention portfolios" in the same way as is suggested for investment portfolios below.

For example, if I think LTFF is overemphasizing AI risk, I can directly donate to or offer to fund biorisk work, instead of donating via LTFF.

At what level of payoff is that bet worth it? Let's say the bet is a 50/50 triple-or-nothing bet. So, either EA ends up with half its money, or ends up with double. I'd guess (based on not much) that right now losing 50% of EA's money is more negative than doubling EA's money is positive.


There is an actual correct answer, at least in the abstract. According to the Kelly criterion, on a 50/50 triple-or-nothing bet, you should put down 25% of your bankroll.

Say EA is now at around 50/50 crypto/non-crypto - what kind of returns would justify that allocation? At 50/50 odds, there's actually no finite multiple that makes the math work out.

But that's just for the strict case we're discussing. See the section on "Investment formula" for what to do about partial losses.

Finally, instead of a 50/50 triple-or-nothing bet, we can model this as a 75/25 double-or-nothing bet (same EV as the original bet). In that case, we get that a 50/50 allocation is optimal.

But note that the Kelly criterion optimizes for log(wealth)! Log(wealth) approximates utility for individuals, but not in aggregate. Since EA is trying to give all its money away, the marginal returns fall off much more gradually. (See some very rough estimates here.) If you're optimizing for something closer to raw wealth, you would be okay with a riskier allocation.

BTW, it's not just "over-invested in X", you have to think about the entire portfolio. So given that almost all EA money is either Sam or Dustin, you have to consider the correlation between Crypto and FB stock.

I'll also add that you have to consider all future EA money in determining what % of the bankroll we're using.

It doesn't really matter though, since EA doesn't "own" or "control" Sam's wealth in any meaningful way.

Are you talking about individuals who are EA, or funds?

For individuals:

 - Depending on your net worth, you may prioritise maximising your own wealth as a safety net, before considering altruistic factors. So the standard stuff still applies.

 - Is impacting twice as many people really twice as good? Are Pascal's wagers worth taking? I wonder if the feel-good factor you get from impacting people follows the same psychological law of diminishing returns as making wealth for yourself.

For funds:

 - You're right, they could attempt to know (or guess) what their present and future donors are invested in; bonus points if they can get (legal or informal) commitments from the donors. And then maybe use this information to hedge against individual donors who they believe are making irrational trades, or else are too far out on the risk curve. (Then again, should funds devoted to EA have the authority to decide this top-down, or is delegating these choices to donors better?)

But there is lower-hanging fruit: afaik EA Funds doesn't invest their money at all, not even in a stock index. Idk about other funds dedicated to EA.

P.S. I wish there was a phrase for "funds devoted to EA but not the org, EA Funds". EA Funds calling themselves that is confusing to write about and probably acts to centralise the ecosystem (for better or worse).

Are you talking about individuals who are EA, or funds?

Mainly funds, but also individuals (especially wealthy individuals)

Imagine we were coordinating among everyone doing patient philanthropy (patient philanthropy being any situation where you keep wealth instead of liquidating it and giving it away to the current best opportunity) and then optimising the overall portfolio. The question is: in what ways would we choose a different portfolio? I'm just explaining one of the considerations, which is the diminishing marginal utility of the whole pool.

[...] bonus points if they can get (legal or informal) commitments from the donors. And then maybe use this information to hedge against individual donors who they believe are making irrational trades, or else are too far out on the risk curve. (Then again should funds devoted to EA have the authority to decide this top-down, or is delegating these choices to donors better?)

I agree, this is exactly what I'm looking for from these comments! This is a pretty reasonable way that we can coordinate on the overall portfolio.

For anyone reading, here's how this would work. Let's say Fund X thinks Rich Individual Y is overly bullish on crypto, and that Fund X's inside view is that EA should only be 20% invested in crypto. Fund X, instead of investing 20% of their funds in crypto, might want to put all their money in other investments instead, to bring the overall EA portfolio closer to 20%.
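As a sketch of the arithmetic, here's the Fund X / Individual Y scenario with made-up dollar amounts (all the figures below are hypothetical, chosen only to illustrate the hedging logic):

```python
# Hypothetical numbers: Individual Y holds $100M at 80% crypto;
# Fund X manages $60M and targets 20% crypto for the combined pool.
y_total, y_crypto_share = 100e6, 0.80
x_total = 60e6
target = 0.20

pool = y_total + x_total
needed_crypto = target * pool             # crypto the pool "should" hold
y_crypto = y_crypto_share * y_total       # crypto Y already holds

# Fund X's crypto allocation to move toward the target (floored at zero):
x_crypto = max(0.0, needed_crypto - y_crypto)
print(f"Fund X crypto allocation: ${x_crypto:,.0f}")
print(f"Resulting pool share: {(y_crypto + x_crypto) / pool:.0%}")
```

With these numbers Fund X's best move is to hold no crypto at all - and note that even then the combined pool stays well above the 20% target, which is why the post says "closer to 20%" rather than "at 20%": a fund can only hedge as far as its own share of the pool allows.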

In general, if people reveal their donation intentions and investment portfolios, we can do a better job hedging against each other to optimise the overall portfolio.

Also, if we come up with a diminishing marginal returns curve, people can judge the optimality of a given portfolio better. @AppliedDivinityStudies has posted a link and thoughts on this part of the discussion.

"[...] afaik EA Funds doesn't invest their money at all"

Wow - This seems pretty bad, unless in practice all their holdings are temporary (i.e. they go down to approximately $0 more than once a year). I'm pretty sure that the optimal portfolio doesn't include much cash, given that cash appreciates in real value at the wonderful interest rate of negative 2-3% (negative 6% last year?)

Let's say Fund X thinks Rich Individual Y is overly bullish on crypto, and that Fund X's inside view is that EA should only be 20% invested in crypto. Fund X, instead of investing 20% of their funds in crypto, might want to put all their money in other investments instead, to bring the overall EA portfolio closer to 20%.

This will require individual Y to be comfortable with fund X thinking and acting this way, or else individual Y won't disclose portfolio info, or may not donate to X at all.

I think it's a model that should be tried, but yeah, there are more considerations that need to be ironed out.

Wow - This seems pretty bad,

Yep, I think they're just holding cash, best if someone confirms.

It's a reason for a donor to prefer a recipient org over EA Funds directly, because recipient orgs tell you in advance how much funding they need and spend a higher fraction of what they receive.

I think you should have made this post a question. It being a post made me think you actually had an answer, so I read it, and was disappointed you didn’t actually conclude anything.

I was thinking about this too (and tried to signal that it was a question rather than an answer). But since I think no one has an answer and it's more a post designed to spur discussion, I made it a post.

Your reasoning seems reasonable in the absence of evidence. I don't know how you were trying to signal it was a question (other than the question-mark in the title, which almost never indicates the intent to simply provoke discussion, and more often means "here is the question I explored in a research project, the details & conclusions of which will follow"). Instead, I think you should have had maybe an epistemic status disclaimer near the beginning. Something like,

Epistemic status: I don't actually know whether EA is over-invested in crypto. This post is intended to spark discussion on the topic.

Perfect, that's what I'm looking for