
There's a great video by Ray Dalio, founder of the hedge fund Bridgewater, where he explains the importance of uncorrelated bets to the performance of a successful long-term investing strategy.

For example, let's say you have $100 to invest, and you can pick how much money to put into each of two stocks, Tesla and Google, which are uncorrelated. Let's say that every year, Tesla has a 95% chance of 10x'ing and a 5% chance of losing all of your money, and that Google has a 50% chance of giving you back exactly what you invested and a 50% chance of doubling your money. And you have zero uncertainty around these numbers: you're 100% sure they're correct.

If your investing strategy is to maximise (naive) expected value, you would simply put all of your money into Tesla! Tesla's expected value is 9.5 times your investment, and Google's is only 1.5 times your investment. This means you would not be investing across a portfolio of uncorrelated bets.
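As a quick sanity check, here's that arithmetic as a minimal Python sketch, using only the probabilities stated above:

```python
# Naive (arithmetic) expected value of each stock over one year.
tesla_ev = 0.95 * 10 + 0.05 * 0   # 95% chance of 10x, 5% chance of losing it all
google_ev = 0.50 * 1 + 0.50 * 2   # 50% chance of 1x, 50% chance of 2x

print(tesla_ev, google_ev)  # 9.5 1.5 -- Tesla looks ~6x better on naive EV
```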

 

Why would picking the highest (naive) EV bet not be the best strategy? Seems like it should be!

This seems very counterintuitive at first, but there is a very simple reason why: your returns in the markets compound over time. If you put all of your money into Tesla, your odds of having any money at all become vanishingly small as time goes on: each year you have a 95% chance of keeping your money, so you'll only still have it after 100 years if you hit that 95% a hundred times in a row. The odds of that are about 0.6% (0.95^100 ≈ 0.006).

This kind of behaviour in the quantity being maximised is called geometric growth. It's a very simple concept: rather than adding the results of your bets over and over, you multiply them over and over. Naive expected value maximisation implicitly assumes your results get added, but in reality, most bets get multiplied together, not added!
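To make the multiplicative point concrete, here's a small hypothetical Python simulation of holding 100% Tesla for 100 years, using the numbers above:

```python
import random

def all_in_tesla(years=100):
    """Wealth multiplier after holding 100% Tesla for `years` years."""
    wealth = 1.0
    for _ in range(years):
        # Each year: 10x with probability 0.95, total loss with probability 0.05.
        wealth *= 10 if random.random() < 0.95 else 0
    return wealth

trials = 100_000
survivors = sum(all_in_tesla() > 0 for _ in range(trials))
print(survivors / trials)  # ~0.006: roughly 0.95**100, i.e. about 0.6%
```

The few surviving runs are astronomically rich, which is exactly why the naive EV looks so good, but almost every run ends at zero.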

When people use a log utility function, it turns out that they're accidentally solving the geometric growth maximisation problem. But I think we need to understand the phenomenon more deeply than simply saying "oh, just use a log function". To what degree is your ultimate utility multiplied, versus added, across these decisions? Is there a way to remove some of the variance, paying a small amount of expected value in exchange?

Here's a good Wikipedia article to develop better intuition around this weird phenomenon. I think it is quite fundamental and important to understand: https://en.wikipedia.org/wiki/Kelly_criterion
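For intuition, here's a hypothetical Python sketch of the Kelly-style version of the Tesla/Google example. The fraction f, the grid search, and the assumption that the two stocks' outcomes are independent are all mine, not from the article; the idea is just to pick the allocation that maximises the expected log of next year's wealth:

```python
from math import log

tesla = [(0.95, 10.0), (0.05, 0.0)]   # (probability, wealth multiplier)
google = [(0.50, 1.0), (0.50, 2.0)]

def expected_log_growth(f):
    """Expected log of next year's wealth with fraction f in Tesla, 1-f in Google."""
    total = 0.0
    for p_t, t in tesla:
        for p_g, g in google:
            wealth = f * t + (1 - f) * g
            if wealth <= 0:
                return float("-inf")  # any branch ending in ruin kills log growth
            total += p_t * p_g * log(wealth)
    return total

# Coarse grid search over allocations: fine enough for intuition.
best = max((f / 1000 for f in range(1000)), key=expected_log_growth)
print(best)  # ~0.94
```

The optimum lands around 94% in Tesla: aggressive, but strictly below 100%, because going all-in carries a 5% chance of ruin every year and hence an expected log growth of minus infinity.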

 

Correlation

A similar weird thing happens with correlation. The most important reason to hold a portfolio when investing is that even if some assets do poorly one year, others may do well: the resulting reduction in the variance of your bets can be worth a reduction in the portfolio's overall expected value per year. Long-term performance comes from high EV combined with low variance, not just high EV!
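As a toy illustration (all numbers here are mine): with two assets of equal expected return and equal standard deviation held 50/50, the portfolio variance works out to sigma^2 * (1 + rho) / 2, so the correlation rho directly controls how much variance reduction diversification buys you:

```python
sigma = 0.20  # assume each asset has a 20% annual standard deviation

for rho in (1.0, 0.5, 0.0, -0.5):
    # Var(0.5*A + 0.5*B) = 0.25*s^2 + 0.25*s^2 + 2*0.25*rho*s^2 = s^2*(1+rho)/2
    stdev = (sigma**2 * (1 + rho) / 2) ** 0.5
    print(f"rho = {rho:+.1f} -> portfolio stdev = {stdev:.1%}")

# rho = +1.0 -> 20.0%  (perfectly correlated: diversification does nothing)
# rho = +0.0 -> 14.1%  (uncorrelated: stdev falls by a factor of sqrt(2))
# rho = -0.5 -> 10.0%
```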

 

EA

If this applies to investing money as a hedge fund, I think we need to understand and apply these phenomena much more deeply when thinking about maximising the impact of EA (EA's overall utility function). Most EAs encourage ideological diversity because of uncertainty in their belief systems; but even more importantly, ideological diversity encourages uncorrelated bets, which smooth out variance from year to year, and that smoothness compounds much better. A small chance of going bust catches up with you eventually, even though it's a small chance.

 

What does this mean for the EA movement as a whole?

As the portfolio of resources influenced by EA grows in size, if we want a smoothly compounding movement, I strongly believe we should assign more weight to how uncorrelated a bet is. Not just because it's more interesting, nicer, and more humble, but because it's simply the better strategy as you manage more and more resources. This really doesn't feel like it's understood well enough!

 

Where do EA correlations come from?

An obvious way to look at correlation is by cause area. If two bets are in the same cause area, then they're correlated.

But unfortunately, correlation is much more insidious than that.

If we're all relying heavily on GiveWell to determine effective charities, all of our effective giving is highly correlated with GiveWell. What if investing in certain for-profit companies were much more effective than donating to the most effective non-profit? For-profit organisations are self-sustaining, and so can reach massive scale (much bigger than AMF). This is an example of an insidious correlation: hidden until it's pointed out to you. For example, providing VC funding for the first meat-substitute startup might be overlooked in the current GiveWell paradigm: it's not a charity!

 

Final question for you

What insidious correlations do you think the EA movement suffers from today? 

Let's focus on whether a strong, nonobvious, and interesting correlation exists, rather than on whether the correlation itself is justified. And don't just share any strong correlation: most of them are quite obvious. Try to share strong correlations we're unaware of until they're pointed out. That's what makes them insidious, and extremely dangerous for the overall success of the movement.

Comments

Great post; it should have more upvotes IMO. I don't see many people thinking much about this.

Thoughts on Correlations:

Too much money comes from tech and crypto - We should diversify EA funding into pharma, energy, healthcare, transport, etc. (We could do this by encouraging E2Gers to go in this direction; there are also direct-impact opportunities here.)

Too much focus on non-profits and not enough on for-profits and entrepreneurship - I've become more sold on for-profits recently. Why? The self-reinforcing mechanism of your product funding itself can create a flywheel effect, allowing scale as fast as possible and an impact that isn't capped by funders. It's worth noting SBF was EA from the start - we should seed the next 5 SBFs to cover the other 5 (and growing) cause areas of EA.

Too much risk aversion - We play by the rules too much and often play it safe. I think the iterative and empirical approach is great, and we hold a lot of that stock in our portfolio as EA; what I think we lack is the ~10% of our portfolio allocated to high-risk, high-reward projects. I'd like to see a culture shift towards larger risk-taking, with more status, money, and awards for failures.

Too much stock is put into specific individuals' and entities' opinions and takes - at the end of the day, one person's opinion is just that. As has been previously written about, EA has a large culture of deference and referencing. Individuals should be encouraged to think for themselves and reduce their level of deference - IMO EA has a bad culture on this front. What are the real chances that 80+% of people would come to a specific conclusion on their own (which often happens in EA)?

It seems to me that there's a difference between financial investment and EA bets: returns on financial bets can then be invested again, whereas returns on most EA bets are not more resources for the EA movement but direct positive impact that helps our ultimate beneficiaries. So we can't get compounding returns from these bets.

So, except for when we're making bets to grow the resources of the EA movement, I don't think I agree that EA making correlated bets is bad in itself - we just want the highest EV bets.

Does that seem right to you?

Hmm, I don't think I agree.

 

I think the most powerful form of compounding in the EA movement context is of people and reputation, which are upstream of money and influence. Great people + great reputation -> more great people + more great reputation. 

 

Most endeavours over long periods of time have some geometric/compounding aspects, and some arithmetic aspects.

 

But usually, I think compounding is more important: that's how you avoid ruin (which isn't a big deal outside of compounding, unless you use log utility, which is equivalent to caring about compounding), and that's how you get really big returns.

 

Successful countries weren't built in a day. Successful charities weren't built in a day. Many things have to go right, and some things must not happen, for a movement to succeed. That's essentially just compounding. 

Good point

One might counter by saying that the majority of decisions EAs make affect the reputation of EA, which can then be drawn on later. Though I doubt most orgs' cost-benefit analyses include changes to the movement's reputation.

Also, maybe there is some mechanism like the world getting better on certain dimensions unlocking EA paths that didn't exist before. But in most cases this doesn't seem super plausible.

I agree the argument doesn't work, but there are at least two arguments for investing in charities with sub-optimal expected values that critically depend on time.

  • Going bust. Suppose you have two charity investments A and B with expected values E[A] and E[B]. Here E[A] > E[B], but there's a potential for E[B] > E[A] in the future, for instance because you receive better information about the charities. If you invest once, investing everything in A is the correct answer, since E[A] > E[B]. Now suppose that each time you don't invest in B, it has a chance of going bust. Then, if you invest more than once, it would be best to invest something in B if the probability of going bust is high enough and E[B] > E[A] with a sufficiently high probability (see the sketch below this list).

  • Signaling effects. Not investing in charity B may signal to charity entrepreneurs that there is nothing to gain by starting a new charity similar to B, thus limiting your future pool of potential investments. I can imagine this being especially important if your calculation of the expected value is contentious, or if E[B] has high epistemic uncertainty.
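To make the going-bust point concrete, here's a toy two-round sketch in Python. Every number in it (the EVs, the flip probability, the bust probability, the 10% lifeline) is hypothetical; it only illustrates the structure of the argument:

```python
# Charity A is worth 1.0 per dollar. B looks worth 0.8 today, but with
# probability p_flip you learn next round that it's actually worth 2.0. If B
# gets nothing this round, it goes bust with probability q (assume any
# funding at all keeps it alive).
ev_a, ev_b_now, ev_b_if_flip = 1.0, 0.8, 2.0
p_flip, q = 0.3, 0.9
budget = 100  # per round

# Strategy 1: everything into A both rounds; B only survives with prob 1 - q.
p_b_available = p_flip * (1 - q)
round2_all_a = p_b_available * ev_b_if_flip + (1 - p_b_available) * ev_a
all_in_a = budget * ev_a + budget * round2_all_a

# Strategy 2: give B a 10% lifeline now, keeping the option open for round 2.
lifeline = 0.10
round1 = budget * ((1 - lifeline) * ev_a + lifeline * ev_b_now)
round2_keep = p_flip * ev_b_if_flip + (1 - p_flip) * ev_a
keep_option = round1 + budget * round2_keep

print(all_in_a, keep_option)  # 203.0 vs 228.0: the lifeline wins
```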

Edit: I think the "going bust" example is similar in spirit to the Kelly criterion, so I suppose you might say the argument does work.

What's the evidence that EA grantmakers are not thinking about this?

At Founders Pledge, we're thinking about this issue a lot in our climate work (indeed, we try to make negatively correlated bets in light of nonlinear climate damage), and I'd be very surprised if we were the only ones, since thinking about uncertainty and its implications is generally one of the strengths of the EA movement.

The Founders Pledge climate fund's stated objective is to "sustainably reach net-zero emissions globally". 

A great example of an insidious correlation of this fund: what about funding work which helps people adapt to climate change, instead of mitigating it?  

For example, can we invent cheap air conditioning units which anyone in the world can afford to buy, to keep humans and crops cool as they migrate away from current coastal areas?

EDIT: let me try to be clearer, since this answer was downvoted twice -- upon seeing the fund, I asked myself, "what belief seems to be shared by all of these investments?" That then led me to the above thought. This is a much better intuition pump than "what should this fund be uncertain about?" I think that's the difference between uncertainty and insidious correlation, and I think you're interpreting insidious correlation as another name for uncertainty.

Thanks for the edit!

I think better sources for what I mean are my talk at SERI or the Changing Landscape report (on mobile, can't easily link, but they're both linked on the Fund page you reference).

I do think we mean the same thing: EA grantmakers do, in fact, seek to avoid these correlations by making uncorrelated or negatively correlated bets (bets where the underlying uncertainties are uncorrelated or negatively correlated). In our case, that means betting on different ways to tackle climate that become relatively more important when another way fails: within innovation, diversifying between accelerating decarbonization and accelerating carbon removal; across broader theories of change, such as innovation and avoiding carbon lock-in. Adaptation could be another one if there were promising options there (the resistant grains you mention are an obvious candidate, though there are reasons to think it's not particularly neglected).

Beyond this narrow niche, if you listen to Sam Bankman-Fried on why he set up the Future Fund rather than just enlarging the pot of Open Philanthropy, and why they work heavily with regrantors, complementarity and avoiding too heavily correlated bets were also an important motivation.
