I'm the CTO of Wave, where we're bringing financial infrastructure to sub-Saharan Africa.
Personal site (incl various non-EA-related essays): https://www.benkuhn.net/
Email: ben dot s dot kuhn at the most common email address suffix
Some of your "conservative" parameter estimates are surprising to me.
For instance, your conservative estimate of the effect of diminishing marginal returns is 2% per year or 10% over 5y. If (say) the total pool of EA-aligned funds grows by 50% over the next 5 years due to additional donors joining—which seems extremely plausible—it seems like that should make the marginal opportunity much more than 10% less good.
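To sanity-check the compounding arithmetic behind "2% per year or 10% over 5y" (a quick sketch, using only the numbers above):

```python
# A 2%/year decline in marginal cost-effectiveness compounds to
# roughly a 10% total decline over 5 years.
decline = 1 - 0.98 ** 5
print(f"{decline:.1%}")  # 9.6%
```

So the two figures are consistent with each other; the question is just whether 2%/year is actually conservative.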
You also wrote
we’ll stick with 5% as a conservative estimate for real expected returns on index fund investing
but used 7% as your conservative estimate in the spreadsheet and in the bottom-line estimates you reported.
I'm looking forward to CEA having a great 2020 under hopefully much more stable and certain leadership!
I’d welcome feedback on these plans via this form or in the comments, especially if you think there’s something that we’re missing or could be doing better.
This is weakly held since I don't have any context on what's going on internally with CEA right now.
That said: of the items listed in your summary of goals, it looks like about 80% of them involve inward-facing initiatives (hiring, spinoffs, process improvements, strategy), and 20% (3.3, 4.1-5) involve achieving concrete outcomes that affect things outside of CEA. The report on progress from last year also emphasized internal process improvements rather than external outcomes.
Of course, it makes sense that after a period of rapid leadership churn, it's necessary to devote some time to rebuilding and improving the organization. And if you don't have a strategy yet, I suppose it makes sense to put "develop a strategy" as your top goal and not to have very many other concrete action items.
As a bystander, though, I'll be way more excited to read about whatever you end up deciding your strategy is than about the management improvements that currently seem to be absorbing the bulk of CEA's focus.
Hmm. You're betting based on whether the fatalities exceed the mean of Justin's implied prior, but the prior is really heavy-tailed, so it's not actually clear that your bet is positive EV for him. (For example, "1:1 odds that you're off by an order of magnitude" would be a terrible bet for Justin, because he has 2/3 credence that there will be no pandemic at all.)
Justin's credence for P(a particular person gets it | it goes world scale pandemic) should also be heavy-tailed, since the spread of infections is a preferential attachment process. If (roughly, I think) the median of this distribution is 1/10 of the mean, then this bet is negative EV for Justin despite seeming generous.
In the future you could avoid this trickiness by writing a contract whose payoff is proportional to the number of deaths, rather than binary :)
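To illustrate why betting at the mean of a heavy-tailed prior goes wrong, here's a minimal Monte Carlo sketch. The numbers are purely illustrative (they are not Justin's actual credences): a 2/3 point mass at zero deaths, and an assumed lognormal tail otherwise.

```python
import math
import random
import statistics

random.seed(0)

# Hypothetical heavy-tailed prior over pandemic deaths (illustrative
# parameters, not anyone's actual credences): with probability 2/3
# there is no pandemic at all; otherwise deaths are lognormal.
def sample_deaths() -> float:
    if random.random() < 2 / 3:
        return 0.0
    return random.lognormvariate(math.log(1e5), 2.0)

samples = [sample_deaths() for _ in range(200_000)]
mean = statistics.fmean(samples)
median = statistics.median(samples)

# For a heavy-tailed prior, most of the mean comes from rare extreme
# outcomes, so P(deaths > mean) is far below 50%. A 1:1 binary bet
# that deaths exceed the mean is therefore negative EV for someone
# who holds this prior, even though it "matches" their point estimate.
p_exceed_mean = sum(d > mean for d in samples) / len(samples)

print(f"median = {median:,.0f}, mean ~ {mean:,.0f}")
print(f"P(deaths > mean) ~ {p_exceed_mean:.3f}")
```

By contrast, a contract that pays proportionally to deaths has expected payout equal to the prior mean by construction, so it prices the disagreement directly without depending on where the threshold falls relative to the median.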
I'm guessing that they assumed we were exaggerating the numbers in order to make them more interested in working with us. The fact that you're so ready to call anyone who lies about user numbers a "scammer" may itself be part of the cultural difference here :)
Examples (mostly from Senegal since that's where I have the most experience, caveat that these are generalizations, all of them could be confounded by other stuff, the world is complicated, etc.):
Exporting different norms is quite hard at scale. You need to hire the people who are closest to the norms that you want, but they'll still probably be far away, so you'll also have to invest a lot in propagating the norms you want, which only really works well one-on-one. When we needed to scale our local Senegal team quickly, we ended up having to compromise on some norms to do so (e.g. salary transparency, amount of paperwork).
Broadly agree, but:
You might end up making more impact if you started a startup in your own country, and just earned-to-give your earnings to GiveWell / EA organizations. This is because I think there are very few startups that benefit the poorest of the poor, since the poorest people don't even have access to basic needs.
Can't you just provide people basic needs then though? Many of Wave's clients have no smartphone and can't read. Low-cost Android phones (e.g. Tecno Mobile) probably provided a lot of value to people who previously didn't have smartphones. Providing people cell service is hard (if you're not a telecom), but if an area has cell service but no internet you can still make useful information products with USSD, SMS, etc., or physical shops.
(I do think that many good startup ideas in the developing world involve providing relatively "basic" needs! But it seems to me like there's decent opportunity there.)
Haha this is probably the first time someone said that about one of my essays—I’m flattered, and excited to potentially write follow ups!
Is there anything in particular you’re curious about? Sometimes it’s hard to be sure of what’s novel vs obvious/common knowledge.
I imagine that a large fraction of EAs expect to be more productive in direct work than in an ETG role. But I'm not too clear why we should believe that. The skills and manpower needed by EA organizations appear to be a small subset of the total careers that the world needs, and it would seem an odd coincidence if the comparative advantage of people who believe in EA happened to overlap heavily with the needs of EA organizations. Remember that EA principles suggest that you should donate to approximately one charity (i.e. the current best one). The same general idea applies to the need for talent: there are a relatively small number of tasks that stand out as unusually in need of more talent.
The "one charity" argument is only true on the margin. It would be incorrect to conclude from this that nobody should start additional charities—for instance, even though GiveWell's current highest-priority gap is AMF, I'm still glad that Malaria Consortium exists so that it could absorb $25m from them earlier this year. Similarly, it's incorrect to conclude from this style of argument that the social returns to talent should be concentrated in specific fields. While there may be a small number of "most important tasks" on the margin, the EA community is now big enough that we might expect to see margins changing over time.
Also, the majority of people who are earning to give would probably be able to fund less than one person doing direct work. If your direct work would be mostly non-replaceable, then earning to give compares unfavorably to doing the direct work yourself. (It seems like e.g. 80k thinks that on the current margin, people going into direct work are not too replaceable.)
If you're really worried about value drift, you might be able to use a bank account that requires two signatures to withdraw funds, and add a second signatory whom you trust to enforce your precommitment to donate?
I haven't actually tried to do this, but I know businesses sometimes have this type of control on their accounts, and it might be available to consumers too.