Happy to chat about my experience in quant trading and living in Chicago/London.
For each person in a leadership role, there's typically a need for at least several people in more junior versions of these roles or in supporting positions (e.g. research assistants, operations specialists, marketers, ML engineers, etc.). I'd typically prefer someone in these roles to an additional person donating $400,000–$4 million per year.
If this is true, why not spend far more on recruiting and wages? It's surprising to me that the upper bound could be so much larger than the equivalent salary in the for-profit sector.
I might be missing something, but it seems to me that the basic implication of the funding overhang is that EA should convert more of its money into 'talent' (via meta spending or simply paying more).
I second these suggestions. To get more specific re cause areas:
Borrowing money if timelines are short seems reasonable, but as others have said, I'm not at all convinced that betting on long-term interest rates is the right move. Partly for this reason, I don't think we should read financial markets as asserting much at all about AI timelines. A couple of more specific points:
Remember: if real interest rates are wrong, all financial assets are mispriced. If real interest rates "should" rise three percentage points or more, that is easily hundreds of billions of dollars' worth of revaluations. It is unlikely that sharp market participants are leaving billions of dollars on the table.
(a) The trade you're suggesting could take decades to pay off, and in the meantime might incur significant drawdown. It's not at all clear that this would be a prudent use of capital for 'sharp money'.
(b) Even if we suppose that sharps want to bet on this, that bet would be a fraction of their capital, which in turn is a fraction of the total capital in financial markets. If all of the world's financial assets are mispriced, as you say, why should we expect this to make a dent?
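To put rough numbers on (b), here's a toy back-of-envelope in Python. Every figure below is an assumption invented for illustration, not an estimate of actual market sizes:

```python
# Toy back-of-envelope for the "fraction of a fraction" point.
# All figures are invented for illustration, not sourced estimates.

global_assets = 400e12    # assumed total global financial assets, USD
sharp_capital = 5e12      # assumed capital run by sophisticated macro investors
rates_bet_share = 0.02    # assumed share of that capital put on a rising-rates bet

bet_size = sharp_capital * rates_bet_share
print(f"Bet size: ${bet_size / 1e9:.0f}B")                        # $100B
print(f"Share of global assets: {bet_size / global_assets:.3%}")  # 0.025%
```

Even with generous assumptions, the implied flow is tiny relative to the asset base that would supposedly need to be repriced.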
There are notable examples of markets seeming to be eerily good at forecasting hard-to-anticipate events:
Setting aside that the examples given are inapposite[1], surely there are plenty in both directions? To pick just one notable counterexample: the S&P 500 hit new all-time highs in mid-Feb 2020, only to crash 32% over the following month and then rise 70% over the following year. So markets did a very poor job of forecasting COVID, as well as the subsequent response, on a time horizon of just a few months!
Both of these came in rapid response to recent major events (albeit ahead of common wisdom), as opposed to an abstract prediction years in the future.
I'm definitely not suggesting a 98% chance of zero, but I do expect the 98% rejected to fare much worse, on average, than the 2% accepted, yes. The data, as well as your interpretation, show steeply declining returns even within that top 2%.
I don't think I implied anything in particular about the qualification level of the average EA. I'm just noting that, given the skewness of this data, there's an important difference between just clearing the YC bar and being representative of that central estimate.
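To make the skewness point concrete, here's a toy distribution with entirely made-up numbers (nothing below is drawn from the actual YC data):

```python
# Toy illustration: with heavily skewed outcomes, the mean is dominated by a
# few outliers, so "clearing the bar" is very different from matching the
# average. All numbers are hypothetical.
import statistics

# Hypothetical exit values (in $M) for 100 accepted founders:
outcomes = [10_000] * 1 + [100] * 9 + [0] * 90

print(f"Mean:   ${statistics.mean(outcomes):,.0f}M")    # $109M
print(f"Median: ${statistics.median(outcomes):,.0f}M")  # $0M
```

In a distribution like this, a marginal founder who just clears the bar looks much more like the median than the mean.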
A couple of nitpicky things, which I don't think change the bottom line, and which have opposing signs in any case:
I worry that this presents the case for entrepreneurship as much stronger than it is.[1]
So at best, if a founder is accepted into YC and is talented enough to have the same odds of success as a random prior YC founder, $4M/yr might be a reasonable estimate of the EV from that point. But I guess my model is more like this: Stripe and Instacart had great product-market fit and talented founders, and this can make a marginal YC startup look much more valuable than it is.
Yeah, I think we're on the same page; my point is just that it only takes a single-digit multiple to swamp that consideration, and my model is that charities aren't usually that close. For example, GiveWell thinks its top charities are ~8x as cost-effective as GiveDirectly, so taken at face value, a match that displaces 1:1 from GiveDirectly would be 88% as good as a 'pure counterfactual'.
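For reference, a quick sketch of that arithmetic in Python (the ~8x figure is GiveWell's; the framing is mine):

```python
# Displacement arithmetic, taking GiveWell's ~8x multiplier at face value.
multiplier = 8.0            # top charities ~8x as cost-effective as GiveDirectly

matched_value = multiplier  # $1 matched to a top charity, in GiveDirectly units
displaced_value = 1.0       # the $1 that no longer goes to GiveDirectly
net_value = matched_value - displaced_value

print(f"Fraction of a pure counterfactual: {net_value / multiplier:.0%}")  # 88%
```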
There are still funds remaining, but it looks like each person can only set up three matched donations.
Some of the comments here are suggesting that there is in fact tension between promoting donations and direct work. The implication seems to be that while donations are highly effective in absolute terms, we should intentionally downplay this fact for fear that too many people might 'settle' for earning to give.
Personally, I would much rather employ honest messaging and allow people to assess the tradeoffs for their individual situation. I also think it's important to bear in mind that downplaying cuts both ways—as Michael points out, the meme that direct work is overwhelmingly effective has done harm.
There may be some who 'settle' for earning to give when direct work could have been more impactful, and there may be some who take away that donations are trivial and do neither. Obviously I would expect the former to be hugely overrepresented on the EA Forum.