I don't think that all flow-through effects are (or even can be) quantified by this sort of analysis. It is surely much easier to measure the growth of an investment account than to quantify the impact of a human life or the downstream effects of improving a human life. As far as I am aware, efforts to quantify this sort of effect have not yet reached the point where I would feel very confident about a side-by-side comparison. If you are aware of something you believe allows for such a comparison, I would be very interested in reading it.
Also donating 10% to honor my GWWC pledge and investing the rest for future giving. This year was a little different from our planned giving, as I was able to find several unique opportunities to donate through a social worker contact in India. The economic impact of the pandemic initially created some crisis-type situations where migrant workers were unable to work or return home, and many of the most vulnerable in urban slums experienced truly critical conditions. Because of our direct contact with people doing work with those communities, we were reasonably conf...
I find it interesting that "giving opportunities whose primary route to impact is making more financial or human resources available to be “spent” on the highest-impact opportunities at a later point in time" were intentionally excluded. One might argue, from a longtermist perspective, that most EA interventions (including those typically viewed as short-termist) will manifest most of their impact via flow-through effects. The compounding effects of investing in the untapped human potential of the global poor now is...
Thanks for your comment. Please note, though, that most types of "flow-through effects" (including those in your example, if I understand you correctly) are included in the analysis.
Investment-like giving opportunities (as defined in the report) are only a very small subset of interventions with substantial flow-through effects, namely those whose gains are reprioritised towards the highest-impact opportunities at a later point in time. Giving to them is similar to investing to give in that both can benefit from exogenous learning.
I also think well-being is not the ideal metric for what type of development would reduce x-risk. When I mention the Gross National Happiness metric, it is just one measure currently in use that includes things like good governance and environmental impact, among many other factors. My point was that growth in GDP is a poor measure of success, and that creating a better metric of success might be a crucial step in improving the current system. I think a measure which attempts to quantify some of the things you mention would be wonderful to
...I already think technology is at a point where welfare does not have to depend on fossil fuel consumption. This is why efforts toward low-carbon or carbon-neutral development, like the Global Green New Deal and other international initiatives, are crucial. I don’t think the western world is a model to be followed so much as a warning of what not to do in many ways. But yeah, I think we are already at a place where development doesn’t have to require a larger carbon footprint; we may just lack the political willpower to implement those technologies at
...I think my point is that we don’t know all that much about what that would look like. I have my own theories, but they may be completely off base because this research is fairly uncommon and neglected. I think economic growth may not even be the best metric for progress; perhaps some derivative of the Gross National Happiness Index or something of that nature would serve better. I do think there may be bidirectional benefit from focusing on the intersection of x-risk and global development. I love the work ALLFED is doing, BTW!
Yes, the US pandemic response in particular is evidence that the wealth of a country is not the most important factor in responding effectively to threats. Also, the “boring apocalypse” scenario seems much more probable to me than any sort of “bang” or rapid extinction event, and I think there is a lot that could be done in the realm of global development to help create a world more robust to that kind of slow burn.
This is great! I’m glad these things are at least on the agenda. I will be following with interest to see what comes of this.
Yes, that is a major risk with this kind of thing, and I cited that article in the disclaimer. I think there is almost certainly a real convergence, but the strength of that convergence is what is debatable. Finding the areas where these cause areas intersect may be a great opportunity, though, and one which EA is uniquely capable of researching and making a positive impact on. So it is good to be skeptical of the convenient-convergence phenomenon, but that shouldn’t blind us to cases where real convergence presents a convenient opportunity to have an outsized impact.
If you had to choose just three long-termist efforts as the highest expected value, which would you pick and why?
(Speaking for myself and not others on the team, etc)
At a very high level, I think I have mostly "mainstream longtermist EA" views here, and my current best guess would be that AI Safety, existential biosecurity, and cause prioritization (broadly construed) are the highest EV efforts to work on overall, object-level.
This does not necessarily mean that marginal progress on these things is the best use of additional resources, or that they are the most cost-effective efforts to work on, of course.