I agree the future constraints are what mostly matter - I speculate about them in the original post.
I also agree earning to give is still useful - simply investing the money and donating when there is more capacity seems like a decent option; and medium donors can play a useful role as angel donors and people matching OP.
I think I'm less confident than you that there will be convergence in the next 10 years. I think it's fairly likely that another 1-3 multibillionaires start significantly funding EA issues, which could mean the amount of funding continues to grow rapidly. The number of people needs to grow significantly faster than the amount of funding to significantly decrease the absolute size of the gap.
I do expect some convergence eventually - it seems easier to 3x or 10x the number of people than the amount of capital - though it's not obvious.
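To make the arithmetic behind the gap concrete, here's a hypothetical sketch (the starting values and growth rates are illustrative, not estimates from the post): if funding and people both grow at the same proportional rate, the absolute gap between them also grows at that rate; the gap only starts to close when people grow meaningfully faster.

```python
# Hypothetical illustration: the absolute gap between funding and
# deployment capacity under different growth rates. Numbers are made up.
funding = 100.0   # arbitrary units of deployable funding
people = 20.0     # arbitrary units of people/deployment capacity

def gap_after(years, funding_growth, people_growth):
    """Absolute gap after compounding both quantities for `years`."""
    return (funding * (1 + funding_growth) ** years
            - people * (1 + people_growth) ** years)

# Both grow 20%/yr: the absolute gap also compounds at 20%/yr.
same_rate = gap_after(10, 0.20, 0.20)    # equals 80 * 1.2**10
# People grow 35%/yr vs funding's 20%/yr: the gap grows more slowly
# and eventually closes (here it goes negative around year 14).
faster_people = gap_after(10, 0.20, 0.35)
print(round(same_rate, 1), round(faster_people, 1))
```

The point of the sketch is just that equal proportional growth leaves the absolute gap compounding upward, which is why "people must grow faster than funding" is the relevant condition.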
I also agree there's a decent chance we discover a fairly effective longtermist money-pit.
Definitely - but that could make the point even stronger. If it's such an outlier, maybe that means it's become easier to do something like this, which is an update in favour of trying.
I agree Alameda seemed like an unusually good opportunity at the time.
It's definitely stronger evidence of that :) Though I've also noticed EAs advancing ahead of my expectations in other areas, like government.
I basically agree with the core point.
I think recent events have been an update in favour of people in effective altruism being super talented, which means we should aim at the very top.
I also think I agree with the arguments that lower risk-aversion means we should aim higher.
I wonder if these arguments especially bite at the ~$1bn+ project level, i.e. there are a lot of startup founders aiming to found a unicorn and make $100m for themselves, but there's very little personal benefit in going from, say, a $10bn company to a $100bn one.
My main pushback is that I'm not sure people should be aiming to become billionaires, given the funding situation. I'd prefer to see people aim at the top in other paths, e.g. winning a Nobel Prize, becoming president, founding a 'megaproject' non-profit etc.
(Though, it seems like the distribution of wealth might be one of the most heavy-tailed, so the rewards of aiming high there might be better than other paths, and EAs seem perhaps unusually good at earning money.)
PS here are two threads from Sam on this topic:
Yes, my figures were proportional rather than absolute.
I was mainly responding to:
EA organizations are growing slower or at pace with the overall EA population
This sounds like a proportional claim to me. My take is that they're growing at the same pace as, or faster than, the overall EA population.
It's true that if they both grow at the same proportional rate, the absolute number of people unable to get jobs will grow. It's less obvious to me that something is 'going wrong' if they both grow at the same rate, though it's true that the bigger the community, the more important it is to think about culture.
I agree with your main point that rejection is painful, has negative effects on the culture, and we should think about how to minimise it.
But I wanted to add that in my post about whether EA is growing, I estimate that the number of people employed in EA orgs and the number of engaged EAs have both been growing at around 20% p.a. since 2015.
If anything, in the last 1-2 years I'd guess that the number of jobs has been growing faster than the total number of people.
There was a period maybe around 2015-2018 when the number of people was more likely to have been growing faster than the number of jobs, but I don't think that's happening right now.
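As a rough compounding check on the ~20% p.a. figure (the year range is an assumption for illustration; the post's estimate runs from 2015):

```python
# Rough compounding check: what ~20% p.a. growth implies over six years.
rate = 0.20
years = 2021 - 2015  # 6 years, assumed endpoint for illustration
multiple = (1 + rate) ** years
print(round(multiple, 2))  # prints 2.99, i.e. roughly a 3x increase
```

So if both jobs and engaged EAs really grew at ~20% p.a., each would have roughly tripled over that period while keeping their ratio about constant.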
I agree more case studies would be great. Unfortunately I don't think producing them is going to be at the top of our stack at 80k for at least a year - right now we're focused on producing content aimed at attracting new readers, and we haven't generally found this type of material is the best for that.
If someone on the forum would like to write a study of their own career though (or interview someone else), I think that could be a pretty useful piece of content. We'd be interested in incorporating them into our planning process, which could really use more worked examples (and could later develop into the practical kind of book you're outlining).
I made a mistake in counting the number of committed community members.
I thought the Rethink estimate of the number of ~7,000 'active' members was for people who answered 4 or 5 out of 5 on the engagement scale in the EA survey, but actually it was for people who answered 3, 4 or 5.
The number of people who answered 4 or 5 is only ~2,300.
I've now added both figures to the post.
Hi Aidan, the short answer is that global poverty seems the most funding constrained of the EA causes. The skill bottlenecks are most severe in longtermism and meta; e.g. at the top of the 'implications' section I said:
The existence of a funding overhang within meta and longtermist causes created a bottleneck for the skills needed to deploy EA funds, especially in ways that are hard for people who don’t deeply identify with the mindset.
That said, I still think global poverty is 'talent constrained' in the sense that: